Exec tSQLt.FakeTable dropped our original table's constraints and data - sql-server

While debugging tSQLt code, I ran the statements below directly, without wrapping them in a test stored procedure, and my original tables' constraints were dropped and some of their data went missing.
EXEC tSQLt.FakeTable @TableName = N'dbo.Employee', @Identity = 1;
EXEC tSQLt.FakeTable @TableName = N'dbo.Salary', @Identity = 1;
How do I prevent running a FakeTable statement in tSQLt from impacting the original table?

There is no way to prevent executing tSQLt.FakeTable outside of the framework. There are also good reasons to not prevent that, so I do not think that adding that functionality is the right approach.
However, if you’re using the newest version of tSQLt, you can use tSQLt.UndoTestDoubles to get the original object(s) back.
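For example, run from the test database (UndoTestDoubles ships with newer tSQLt releases, so check that your installed version includes it):

```sql
-- Restores every object currently replaced by a tSQLt test double
-- (FakeTable, SpyProcedure, ...) back to its original definition.
EXEC tSQLt.UndoTestDoubles;
```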

Ugh, been there... You can't prevent it, short of contributing to the project and putting a pull request in to add the functionality.
FakeTable creates a backup of your original table so you should be able to get the original table back. These backup table names start with tSQLt.tempobject and end in an identifier. You can delete the new "fake" table (which now has the name of your original table) and rename the tempobject table if/when you find it.
Something I've done in the past is to query for a column that I know is in the table to find the name of the tSQLt table:
SELECT t.name
FROM sys.columns c
INNER JOIN sys.tables t ON t.object_id = c.object_id
WHERE c.name = 'SomeCol';
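Once you've located the backup, putting it back by hand looks roughly like this; every name below is illustrative (use the table name the query above actually returns, in whatever schema your backup actually sits):

```sql
-- The fake table currently occupies the original name; drop it first.
DROP TABLE dbo.Employee;
-- Then rename the tSQLt backup back into place.
-- 'tSQLt_tempobject_xxxxxxxx' stands in for the real generated name.
EXEC sp_rename 'dbo.tSQLt_tempobject_xxxxxxxx', 'Employee';
```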

Related

Need to identify if a T-SQL query I run is modifying any records without access to logs

I'm looking to modify a stored procedure that has a long chain of stored procedures within it. I'm not sure which parts of this proc cause updates to live tables, though. What I want to do is keep all of the temp tables it creates and select from them, but prevent any record changes via update, insert, delete, etc. Ideally I want to be able to see this info directly inside SSMS without DBA-level permissions. I'm running this on a test DB, so it would also work if something could tell me which tables were changed after the fact. I could then find the update, prevent it, roll back to a snapshot, then run it again until it shows 0 changes.
I've tried going through by hand, searching for keywords like Update, Into, and Exec. However, this involves a lot of human judgment and adds a lot of room for human error. I've also considered wrapping this in a begin..rollback transaction to undo any unintended changes, but this proc can take upwards of 10 minutes to run and I don't want an open transaction that long. I'm also not entirely certain that there isn't a commit tran hiding in one of the stored procedures called by this one.
Any help provided would be greatly appreciated, thanks.
As long as the stored procedures don't use dynamic SQL and the like, you could use the built-in utilities to recursively find any referenced tables and stored procedures. This code will show referenced columns and the type of action. I have never used this at large scale, so definitely spot-check as you go.
MSDN Documentation
CREATE TABLE dbo.someData
(
id INT,
name VARCHAR(100)
)
GO
CREATE OR ALTER PROC dbo.doSomething
AS
SELECT name FROM dbo.someData
UPDATE d
SET d.id = 2
FROM dbo.someData d
GO
SELECT
    --SP, view, or function doing the referencing
    ReferencingName = o.name,
    ReferencingType = o.type_desc,
    --Referenced field
    ref.referenced_database_name, --will be null if the DB is not explicitly called out
    ref.referenced_schema_name,   --will be null or blank if the schema is not explicitly called out
    ref.referenced_entity_name,
    ref.referenced_minor_name,
    --these will tell you what it's doing
    ref.is_updated,
    ref.is_selected
FROM
    sys.objects o
CROSS APPLY
    sys.dm_sql_referenced_entities(SCHEMA_NAME(o.schema_id) + '.' + o.name, 'OBJECT') ref
WHERE
    o.type = 'P'
    AND o.name LIKE '%something%'

How could our database be causing SqlPackage to fail? (SQL72018)

Hoping someone else has come across this, because Google returns only nine results for this error! Information about SqlPackage seems a little scant still.
We're currently going through the process of migrating to a continuous deployment environment. As part of this, we're using database projects to store database schema and the build server is using SqlPackage.exe to generate an upgrade script by comparing each project's .dacpac file with its associated schema template database that's hosted on the server.
We have six databases so far (with more to come) and they all work fine apart from one, which throws the following error when SqlPackage is modelling the 'target' database:
Error SQL72018: Trigger could not be imported but one or more of these objects exist in your source.
The only thing we can think of is that it's a problem with the size of the target database; perhaps SqlPackage is running out of memory? It's the largest database schema we have, so it's certainly feasible. If it's down to a memory limitation of SqlPackage, how do we go about increasing it?
We're going to start removing objects from the target database and the source project to see if we can establish whether it's down to size or a specific schema object, but any ideas and suggestions in the meantime would be greatly appreciated!
Update
I just tried removing all triggers from the target database, and now it spits out the upgrade script with no errors. Next I'll try removing only half of them, and see if I can narrow it down to one specific trigger. I suspect it may simply be the size of the schema, which goes back to the SqlPackage memory question.
Okay .. worked it out.
We went through a process of dropping triggers until we narrowed it down to a single trigger that was causing the error. Turns out that there was something rather wrong with it - it seems that it's meant to be attached to a table, but isn't (i.e. it doesn't appear in the triggers list). So I'm guessing this minor corruption caused SqlPackage to fail.
In case anyone finds it useful, this is the script I used to drop ranges of triggers, which helped me find the culprit:
http://www.codeproject.com/Tips/662699/Drop-all-Triggers-belonging-to-any-schema-in-MS-SQ
USE ClaimsSqlPackageTest
DECLARE @SQLCmd nvarchar(1000)
DECLARE @Trig varchar(500)
DECLARE @sch varchar(500)
DECLARE @count int = 0
DECLARE TGCursor CURSOR FOR
SELECT ISNULL(tbl.name, vue.name) AS schemaName
     , trg.name AS triggerName
FROM sys.triggers trg
LEFT OUTER JOIN (SELECT tparent.object_id, ts.name
                 FROM sys.tables tparent
                 INNER JOIN sys.schemas ts ON ts.schema_id = tparent.schema_id)
    AS tbl ON tbl.object_id = trg.parent_id
LEFT OUTER JOIN (SELECT vparent.object_id, vs.name
                 FROM sys.views vparent
                 INNER JOIN sys.schemas vs ON vs.schema_id = vparent.schema_id)
    AS vue ON vue.object_id = trg.parent_id
OPEN TGCursor
FETCH NEXT FROM TGCursor INTO @sch, @Trig
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @SQLCmd = N'DROP TRIGGER [' + @sch + '].[' + @Trig + ']'
    IF @count >= 155 AND @count <= 160
    BEGIN
        EXEC sp_executesql @SQLCmd
        PRINT @SQLCmd
    END
    SET @count = @count + 1
    FETCH NEXT FROM TGCursor INTO @sch, @Trig
END
CLOSE TGCursor
DEALLOCATE TGCursor
Putting a trigger on a table that has the extended property 'microsoft_database_tools_support' is a simple way to produce this error. For example putting a trigger on the sysdiagrams table.
Apparently this extended property makes the object invisible to VS and the existence of a trigger on an invisible object is confusing. You can either make both objects visible by removing the microsoft_database_tools_support properties or make them both invisible by adding the property to the other object.
So in my example of putting a trigger named iut_sysdiagrams on sysdiagrams I'd use this:
EXEC sys.sp_addextendedproperty @level0type = N'SCHEMA', @level0name = N'dbo'
    ,@level1type = N'TABLE', @level1name = N'sysdiagrams'
    ,@level2type = N'TRIGGER', @level2name = N'iut_sysdiagrams'
    ,@name = N'microsoft_database_tools_support', @value = 1;

How to delete all tables from db? Cannot delete from sys.tables

How can I perform this query on whatever way:
delete from sys.tables where is_ms_shipped = 0
What happened is, I executed a very large query and forgot to put a USE directive at the top of it. Now I have a zillion tables in my master db, and I don't want to delete them one by one.
UPDATE: It's a brand new database, so I don't have to care about any previous data, the final result I want to achieve is to reset the master db to factory shipping.
If this is a one-time issue, use SQL Server Management Studio to delete the tables.
If you must run a script very, very carefully use this:
EXEC sp_MSforeachtable 'DROP TABLE ?'
One method I've used in the past, which is pretty simple and relatively foolproof, is to query the system tables / info schema (depending on exact requirements) and have it output the list of commands I want to execute as the result set. Review that, copy & paste, run: quick & easy for a one-time job, and because you're still manually hitting the button on the destructive bit, it's (IMHO) harder to trash stuff by mistake.
For example:
select 'drop table ' + name + ';', * from sys.tables where is_ms_shipped = 0
No backups? :-)
One approach may be to create a Database Project in Visual Studio with an initial Database Import. Then delete the tables and synchronize the project back to the database. You can do the deletes en masse with this approach while being "buffered" with a commit phase and UI.
I am fairly certain the above approach can be used to take care of the table relationships as well (although I have not tried in the "master" space). I would also recommend using a VS DB project (or other database management tool that allows schema comparing and synchronization) to make life easier in the future as well as allowing version-able (e.g. with SCM) schema change-tracking.
Oh, and whatever is done, please create a backup first. If nothing else, it is good training :-)
Simplest and shortest way I did was this:
How to Rebuild System Databases in SQL Server 2008
The problem with the other answers here is that they don't work: when there are related tables, the drops refuse to execute.
This one not only works but is actually what I was looking for: "reset to factory defaults", as stated in the question.
Note that this will delete everything, not only tables.
This code could be better, but I was trying to be cautious when I wrote it. I think it is easy to follow and easy to tweak for testing before you commit to deleting your tables.
DECLARE
    @Prefix VARCHAR(50),
    @TableName NVARCHAR(255),
    @SQLToFire NVARCHAR(350)
SET @Prefix = 'upgrade_%'
WHILE EXISTS (SELECT name FROM sys.tables WHERE name LIKE @Prefix)
BEGIN
    SELECT TOP 1 --This query only iterates if you are dropping tables
        @TableName = name
    FROM sys.tables
    WHERE name LIKE @Prefix
    SET @SQLToFire = 'DROP TABLE ' + QUOTENAME(@TableName)
    EXEC sp_executesql @SQLToFire;
END
I did something really similar, and what I wound up doing was using Tasks --> Script Database to script only drops for all the database objects of the originally intended database, meaning the database I was supposed to run the giant script on (and did). Be sure to include IF EXISTS in the advanced options. Then run that script against master and BAM: it deletes everything that exists in the original target database and also exists in master, leaving the differences, which should be the original master objects.
Not very elegant, but this is a one-time task.
WHILE EXISTS(SELECT * FROM sys.tables where is_ms_shipped = 0)
EXEC sp_MSforeachtable 'DROP TABLE ?'
Works fine on this simple test: the first pass fails to drop a (because b still references it) but drops b, and the second pass then drops a.
create table a
(
a int primary key
)
go
create table b
(
a int references a (a)
)
insert into a values (1)
insert into b values (1)

How to completely clean out a SQL Server 2005 database?

I made a copy of a DB that is used for a web app in order to create a new instance of this web app. How do I remove all the data, transactions, and whatnot so that it is just a clean, empty shell of tables ready to be written with new data?
Sql Server Database Publishing Wizard. Create a script with just the schema, specifying to drop the existing objects.
run this script:
select 'TRUNCATE TABLE ' + name from sysobjects where xtype='U'
and then paste the results into a new script and run that
(And for God's sake, be careful!) :)
EDIT
From the comments, it seems TRUNCATE fails on tables that are referenced by foreign keys.
You could use
select 'DELETE FROM ' + name from sysobjects where xtype='U'
and you would also have to rearrange the output to delete from child tables first. Others have suggested scripting a clean database and that is probably a better idea TBH.
Uncomment the EXEC line to actually run... BE CAREFUL!!
DECLARE @t varchar(1024)
DECLARE tbl_cur CURSOR FOR
SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'
OPEN tbl_cur
FETCH NEXT FROM tbl_cur INTO @t
WHILE @@FETCH_STATUS = 0
BEGIN
    --EXEC ('TRUNCATE TABLE ' + @t)
    FETCH NEXT FROM tbl_cur INTO @t
END
CLOSE tbl_cur
DEALLOCATE tbl_cur
EDIT:
In answer to the comment question... damn good question. I imagine you could find all the foreign keys and save them off
SELECT 'ALTER TABLE [' + b.TABLE_NAME + '] WITH CHECK ADD CONSTRAINT [' + a.CONSTRAINT_NAME + '] FOREIGN KEY '
    + '([' + c.COLUMN_NAME + ']) REFERENCES [' + d.TABLE_NAME + '] ([' + e.COLUMN_NAME + '])'
FROM INFORMATION_SCHEMA.REFERENTIAL_CONSTRAINTS a
INNER JOIN INFORMATION_SCHEMA.TABLE_CONSTRAINTS b
    ON a.CONSTRAINT_NAME = b.CONSTRAINT_NAME
INNER JOIN INFORMATION_SCHEMA.TABLE_CONSTRAINTS d
    ON a.UNIQUE_CONSTRAINT_NAME = d.CONSTRAINT_NAME
INNER JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE c
    ON a.CONSTRAINT_NAME = c.CONSTRAINT_NAME
INNER JOIN (
    SELECT f.TABLE_NAME, g.COLUMN_NAME
    FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS f
    INNER JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE g
        ON f.CONSTRAINT_NAME = g.CONSTRAINT_NAME
    WHERE f.CONSTRAINT_TYPE = 'PRIMARY KEY'
) e ON e.TABLE_NAME = d.TABLE_NAME
ORDER BY a.CONSTRAINT_NAME
and then you could drop all of them (I don't believe it matters in which order you drop the constraints):
SELECT 'ALTER TABLE ' + col.TABLE_NAME + ' DROP CONSTRAINT ' + u.CONSTRAINT_NAME
FROM INFORMATION_SCHEMA.COLUMNS col
INNER JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE u
ON col.TABLE_NAME = u.TABLE_NAME
AND col.COLUMN_NAME = u.COLUMN_NAME
INNER JOIN INFORMATION_SCHEMA.table_constraints t
ON u.CONSTRAINT_NAME = t.CONSTRAINT_NAME
WHERE t.CONSTRAINT_TYPE = 'FOREIGN KEY'
and THEN use the first cursor to truncate all the tables. Then you can use the results of the script you saved off to recreate all of the FK relationships.
I don't know of any one step magical silver bullet command to do so, but if you want to preserve your tables/schemas, you'd probably need to script a truncate table for each.
Alternatively, you could script out the whole database and use that one script to regenerate a new database after you drop the "used" one. Making sense?
Right click on the Database you want to deal with, select Script Database As (3rd option from the top), then the option DROP and CREATE to ... at which point maybe you want to do this to a file or the clipboard and paste it somewhere.
Then, with this file handy as your script, run it to create a clean nice database.
You could create an empty database, and then use something like SQL Compare to compare your existing database against the empty one to generate scripts to recreate the database entirely from scratch.
You could also use the following SQL:
--// Switch to the database to be modified
USE DatabaseName;
--// The following commands need to be run for each table
--// You could perhaps automate this by using a cursor
--// First truncate the table and remove all data
TRUNCATE TABLE MyTable;
--// Also reset the identity seed
DBCC CHECKIDENT (MyTable, reseed, 1)
I'd recommend re-creating the database structure from scratch, rather than doing a backup-and-restore-to-new-database, as this will give you a completely clean database without any 'residue' (like stuff in the transaction log).
Truncating will work if you have no foreign keys defined (and if you don't have any, please, please add them).
If you're using SQL Server 2005 (2008 is likely the same), you can generate a script for the entire database from within SQL Server Management Studio. Right-click on the database you want to script.
Then go to Tasks, and Generate Scripts. Script out all the objects, then you can use this script to build a fresh copy of the DB based on just the schema.
This article presents a stored procedure without any of the mentioned problems.
The key is to disable referential integrity :)
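The linked procedure isn't reproduced here, but a common sketch of that idea (assuming the undocumented sp_MSforeachtable is acceptable in your environment) is:

```sql
-- 1. Disable every FK and check constraint so child rows can't block deletes.
EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL';
-- 2. DELETE works where TRUNCATE would fail on FK-referenced tables.
EXEC sp_MSforeachtable 'DELETE FROM ?';
-- 3. Re-enable and re-validate the constraints (WITH CHECK CHECK).
EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL';
```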
You may want to consider just generating a t-sql script that only includes the structure from your existing database. The SQL Server Management Console makes this very easy, as you just need to right click on your original database, select 'tasks->generate scripts'. From there, just click through the defaults and select the objects that you want to duplicate (tables, etc).
This generates a nice T-SQL script that you can apply to any blank database, giving you the structure that you are looking for without the data. To me, this seems to be a more appropriate option as compared to truncation.
you would just truncate each table as in
use [dbname]
truncate table [table]
where [dbname] is the name of the copied database, and you would copy the 2nd line for each table in the database.
I'm sure that with about 5-10 minutes you could write a script that reads the table list from the system tables and loops over it, so you wouldn't have to write "truncate table [table]" for each table; but that's the general idea.
--
Ok, to all who answered after me: I'm getting this mental "guilt" thing going on because I didn't write that he should have created a T-SQL script and re-created the database from that. There were several reasons why I didn't go that route:
1. You don't know what sorts of "static" data he has in the database.
2. He specifically asked about how to clear the database.
3. I don't currently know what the OP has built into the rest of the system; there could be dependencies that require a copy of the original in order to satisfy a condition.
Had the gentleman asked this in a different manner, I may have indeed answered like so many have and simply stated to script the database out.

Why would IF EXISTS not work?

I have a lot of code I am trying to run where I'm querying the sysobjects table to check if an object exists before I drop it and create it again.
Issue being, sometimes if I go:
if not exists (select name from sysobjects o where o.name = 'my_table' and o.type = 'U')
CREATE TABLE my_table (..)
go
it works, no worries. However, when I came back to run it again, I get this lovely error:
SQL Server Error on (myserver) Error:2714 at Line:10 Message:There is already an object named 'my_table' in the database.
Thanks for that, SQL Programmer. I actually asked for you not to create this table if it already exists. -_-
Any ideas?
The logic of what you are doing doesn't seem quite right. Based on your statement:
"I am trying to run where I'm querying the sysobjects table to check if an object exists before I drop it and create it again"
you should simply do a drop followed by a create. This way is usually better because it ensures the table will be updated; if the table existed and you had changes, you are probably not getting what you want.
The immediate issue you are running into is an assumed db ownership that was not consistent between runs.
based on your clarification below - here is what you can do:
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[XXXX]') AND type in (N'U'))
DROP TABLE [dbo].[XXXX]
GO
CREATE TABLE [dbo].[XXXX] (...
GO
you can run this over and over again...
The Sybase parser's object-validation pass is global and not based on conditional evaluation. Even though your code cannot execute the CREATE TABLE, the statement is still checked for syntax and applicability, which fails when the system sees that the table already exists.
The only way around this that I know of is to put your CREATE statements inside an EXEC(), which is evaluated only if that branch is executed.
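A sketch of that workaround, reusing the my_table shape from the question (the column list here is made up):

```sql
-- The CREATE TABLE lives inside a string, so the parser only
-- validates it when the EXEC() actually runs at execution time.
if not exists (select name from sysobjects o
               where o.name = 'my_table' and o.type = 'U')
    exec('CREATE TABLE my_table (id int, name varchar(100))')
go
```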
yes, the entire batch of SQL is normalized and compiled so as to create an "execution plan" for the entire batch. During normalization, the "possible" "create table" statement is a problem if it already exists at compile time.
My solution: rename -
if exists (select 1 from ....)
begin
    drop table xyz
    create table xyz_zzzz ( ... )
    exec sp_rename 'xyz_zzzz', 'xyz'
end
