SQL - table schema change in trigger then update dependencies (views) - sql-server

Here is the scenario (code is C#):
We allow columns to be created on our tables from within our app. When a record is inserted into our metadata table (called mdt from here on), a trigger on mdt creates a corresponding column on the target table. At the end of this trigger we update all the dependencies (views) of the table that just received the new column.
In SSMS I can run all of the SQL and it works. From code, all of the SQL runs as well, but the dependencies are not updated. The code runs all SQL in a transaction, but I have run the same tests in SSMS inside a transaction and the SSMS version still works while the app does not. By "not work" I mean that the dependencies do not refresh. The column creation succeeds in both cases, which means the trigger is firing, which in turn means the code that regenerates the view dependencies is running.
So, in short:
Insert record > Fire trigger > Create Column > Refresh View Dependencies
SSMS - everything works
From code - everything works, except that the refreshing of the view dependencies is not actually refreshing the views
My first thought was that the transactions are being handled differently between code and SSMS. However, I added code in the app that runs the refresh script outside of the transaction that creates the new column, and the dependencies are still not updated, while the same SQL run in SSMS updates them fine.
Any ideas would be appreciated.
This is the code we are using to refresh the views:
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[uspRefreshViewDependencies]') AND type IN ('P'))
    DROP PROCEDURE [dbo].[uspRefreshViewDependencies]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE [dbo].[uspRefreshViewDependencies](
    @nvcViewName nvarchar(776),
    @bitIncludeCustomerObjects bit = 1
)
AS
/*
This procedure will correct View and UDF dependencies after columns have changed. Given the name of a View
for @nvcViewName, it will detect all of the Views and UDFs (both table-valued and inline table-valued) that
reference it, and call sys.sp_refreshsqlmodule to update them. It runs recursively (so it can correct
dependencies on dependencies on dependencies, and so on).
If @bitIncludeCustomerObjects is false, the algorithm will skip custom database objects (anything in the dbo
schema that starts with the letter "c").
This was created for managing Entity Views but it should also work for tables and UDFs.
*/
DECLARE c CURSOR FORWARD_ONLY STATIC FOR
WITH dependencies (referencing_id, referencing_object_name, type, level)
AS
(
    SELECT referencing_id, OBJECT_SCHEMA_NAME(referencing_id) + '.' + OBJECT_NAME(referencing_id), SO.type, 1 AS level
    FROM sys.dm_sql_referencing_entities(OBJECT_SCHEMA_NAME(OBJECT_ID(@nvcViewName)) + '.' + OBJECT_NAME(OBJECT_ID(@nvcViewName)), 'OBJECT') sre
    INNER JOIN sys.objects SO ON sre.referencing_id = SO.object_id
    WHERE SO.type IN ('V', 'TF', 'IF', 'P')
    AND (
        @bitIncludeCustomerObjects = 1
        OR OBJECT_NAME(referencing_id) NOT LIKE 'c%')
    UNION ALL
    SELECT SED.referencing_id, OBJECT_SCHEMA_NAME(SED.referencing_id) + '.' + OBJECT_NAME(SED.referencing_id), SO.type, level + 1
    FROM dependencies D
    INNER JOIN sys.sql_expression_dependencies SED ON D.referencing_id = SED.referenced_id
    INNER JOIN sys.objects SO ON SED.referencing_id = SO.object_id
    WHERE SO.type IN ('V', 'TF', 'IF', 'P')
    AND (
        @bitIncludeCustomerObjects = 1
        OR OBJECT_NAME(SED.referencing_id) NOT LIKE 'c%')
)
SELECT referencing_object_name, type, MAX(level) AS level
FROM dependencies
WHERE (OBJECT_SCHEMA_NAME(referencing_id) <> 'Custom')
GROUP BY referencing_object_name, type
ORDER BY level

DECLARE @nvcDependency nvarchar(776)
DECLARE @nvcType nvarchar(30)
DECLARE @intLevel int

OPEN c
FETCH NEXT FROM c INTO @nvcDependency, @nvcType, @intLevel
WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT 'Refreshing ' + @nvcDependency
    EXEC sys.sp_refreshsqlmodule @nvcDependency
    FETCH NEXT FROM c INTO @nvcDependency, @nvcType, @intLevel
END
CLOSE c
DEALLOCATE c
GO
I have found that when running this in SSMS everything runs as expected; when run from C#, however, the CTE seems to return no results and the cursor loop is never entered.
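One difference worth checking: GO is a batch separator understood by client tools such as SSMS and sqlcmd, not by the server. If the C# code sends a whole script (including GO lines) as a single command, the server receives one batch and the statements can behave differently than they do in SSMS. A minimal sketch of the client-side splitting SSMS performs (not its full parser):

```python
import re

def split_batches(script: str) -> list[str]:
    """Split a T-SQL script on GO lines, the way SSMS and sqlcmd do.

    GO is client-side only: if an application sends the whole script
    in one command, the server sees a single batch.
    """
    batches, current = [], []
    for line in script.splitlines():
        # GO must sit on its own line (optional repeat count is ignored here)
        if re.fullmatch(r"\s*GO\s*(\d+\s*)?", line, flags=re.IGNORECASE):
            if current:
                batches.append("\n".join(current))
                current = []
        else:
            current.append(line)
    if current:
        batches.append("\n".join(current))
    return [b for b in batches if b.strip()]

script = """SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE dbo.Demo AS SELECT 1
GO"""
print(len(split_batches(script)))  # 3 batches, as SSMS would execute them
```

If the app executes the refresh script without this splitting, that alone can explain SSMS-versus-code differences.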

How to Move Data from Transactional Databases to a Master Database with SSIS

I am very new to SSIS and I need to write a package that will move data from transactional databases to a master database. We have a transactional database per plant and the schema for all of these is identical. I need to go through each table in each database and copy all the data that hasn't been marked as exported to its corresponding table in the master database. After the records are successfully copied to the master database they should be marked as exported in the transactional database.
So far I've gotten my SSIS package to where I can iterate through the plant databases and read from one of the tables. I'm currently storing the results from that table in a variable. I accomplished the iteration part by using an expression in the For Each Loop Container's Connection Manager that sets the Initial Catalog to the current database name in the loop.
However, I'm not sure how to proceed after that. Here's a picture of my package's current state:
I've tried creating another Execute SQL Task that takes the results from Get New Apples and copies them to the master database. However, from what I've googled so far there doesn't seem to be an easy way to accomplish this.
A different approach I've tried is to create an OLE DB Source using the same connection manager as the For Each Loop Container. When I do that I get an error saying that the Apple table is not a valid object (my query being select * from Apple where exported = 0;).
Any suggestions as to how I can read a result set from a variable or get the OLE DB Source to work with the aforementioned Connection Manager would be very helpful.
I'm also open to alternate methods to accomplishing this. Like I said, I'm new to SSIS and am still feeling my way around it.
Originally I tried to make this a stored procedure, but it started to grow unmanageable and ugly very quickly:
SELECT *
INTO #tempapple
FROM (SELECT *
FROM [Plant1].[dbo].[Apple]
WHERE exported = 0
UNION
SELECT *
FROM [Plant2].[dbo].[Apple]
WHERE exported = 0) AS x;
INSERT INTO [Master].[dbo].[Apple]
SELECT id,
NAME,
description,
active,
plant
FROM #tempapple
WHERE id NOT IN (SELECT id
FROM [Master].[dbo].[Apple]);
UPDATE [Plant1].[dbo].[Apple]
SET exported = 1
WHERE id IN (SELECT id
FROM #tempapple);
UPDATE [Plant2].[dbo].[Apple]
SET exported = 1
WHERE id IN (SELECT id
FROM #tempapple);
DROP TABLE #tempapple;
I've got to make a few assumptions here:
1. The variable is of type Object.
2. The Foreach Loop uses an ADO enumerator over that Object, setting the current db name into a variable.
3. Add an expression task before the data flow.
4. In the expression, set a new string variable to "SELECT * FROM " + [dbname] + ".[schema].[tablename] WHERE exported = 0". Note that dbname comes from the enumerated set in #2.
5. In your data flow, set the source to use the variable from #4.
This should at least get your data loaded.
For updating the exported column in the source, you have options.
I'm writing this directly, so you may need to modify it slightly.
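The string that the expression in step 4 builds can be sketched outside SSIS; the database, schema, and table names below are illustrative only:

```python
def build_source_query(dbname: str, schema: str = "dbo", table: str = "Apple") -> str:
    """Compose the per-database source query from the db name supplied
    by the Foreach loop, mirroring the SSIS expression in step 4.
    """
    return f"SELECT * FROM [{dbname}].[{schema}].[{table}] WHERE exported = 0"

for db in ["Plant1", "Plant2"]:
    print(build_source_query(db))
```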
declare @dbname varchar(100) -- db name
declare @SQL varchar(max)

declare db_cursor cursor for
[ this is where you insert your code for getting DB names ]

OPEN db_cursor
fetch next from db_cursor into @dbname
while @@FETCH_STATUS = 0
BEGIN
    set @SQL = 'Select * into #temptable from ' + @dbname + '.[dbo].[Apple] where exported = 0
    INSERT INTO [Master].[dbo].[Apple]
    SELECT id,
        NAME,
        description,
        active,
        plant
    FROM #temptable
    -- no where clause needed
    UPDATE a
    SET exported = 1
    from ' + @dbname + '.[dbo].[Apple] a
    join #temptable tt on a.id = tt.id
    DROP TABLE #temptable; '
    exec(@SQL);
    fetch next from db_cursor into @dbname
END
close db_cursor
deallocate db_cursor
I've decided to settle for a mix of my two approaches. The SSIS package remains mostly the same with the logic to iterate through each plant database. Within the loop I now have several Execute SQL Tasks to import data from the various tables. The logic for the import apples task looks something like this:
SELECT *
INTO #tempapple
FROM apple
WHERE exported = 0;
INSERT INTO [Master].[dbo].[apple]
SELECT id,
NAME,
description,
active,
plant
FROM #tempapple
WHERE id NOT IN (SELECT id
FROM [Master].[dbo].[apple]);
UPDATE apple
SET exported = 1
WHERE id IN (SELECT id
FROM #tempapple);
DROP TABLE #tempapple;
This lets me avoid redundant SQL, since each task is executed once per plant database.
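The per-plant task above is a "copy what's new, then mark it exported" pattern; the NOT IN guard is what makes it safe to re-run after a partial failure. A minimal in-memory sketch of that logic (plain dicts stand in for table rows):

```python
def import_unexported(plant_rows, master_rows):
    """Copy rows not yet exported and not already in master, then mark
    the source rows exported. In the real package this runs as T-SQL
    against the plant and master databases.
    """
    master_ids = {r["id"] for r in master_rows}
    staged = [r for r in plant_rows if not r["exported"]]
    for row in staged:
        # the NOT IN guard: skip ids already copied on a previous run
        if row["id"] not in master_ids:
            master_rows.append(dict(row, exported=True))
    for row in staged:
        row["exported"] = True  # mark source rows so they are skipped next run
    return master_rows

plant = [{"id": 1, "exported": False}, {"id": 2, "exported": True}]
master = []
import_unexported(plant, master)
print([r["id"] for r in master])  # [1]
```

Running the function a second time copies nothing new, which is the idempotence the NOT IN clause buys.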

How could our database be causing SqlPackage to fail? (SQL72018)

Hoping someone else has come across this, because Google returns only nine results for this error! Information about SqlPackage seems a little scant still.
We're currently going through the process of migrating to a continuous deployment environment. As part of this, we're using database projects to store database schema and the build server is using SqlPackage.exe to generate an upgrade script by comparing each project's .dacpac file with its associated schema template database that's hosted on the server.
We have six databases so far (with more to come) and they all work fine apart from one, which throws the following error when SqlPackage is modelling the 'target' database:
Error SQL72018: Trigger could not be imported but one or more of these objects exist in your source.
The only thing we can think of is that it's a problem with the size of the target database; perhaps SqlPackage is running out of memory? It's the largest database schema we have, so it's certainly feasible. If it's down to a memory limitation of SqlPackage, how do we go about increasing it?
We're going to start removing objects from the target database and the source project to see if we can establish whether it's down to size or a specific schema object, but any ideas and suggestions in the meantime would be greatly appreciated!
Update
I just tried removing all triggers from the target database, and now it spits out the upgrade script with no errors. Next I'll try removing only half of them, and see if I can narrow it down to one specific trigger. I suspect it may simply be the size of the schema, which goes back to the SqlPackage memory question.
Okay .. worked it out.
We went through a process of dropping triggers until we narrowed it down to a single trigger that was causing the error. Turns out that there was something rather wrong with it - it seems that it's meant to be attached to a table, but isn't (i.e. it doesn't appear in the triggers list). So I'm guessing this minor corruption caused SqlPackage to fail.
In case anyone finds it useful, this is the script I used to drop ranges of triggers, which helped me find the culprit:
http://www.codeproject.com/Tips/662699/Drop-all-Triggers-belonging-to-any-schema-in-MS-SQ
Use ClaimsSqlPackageTest
DECLARE @SQLCmd nvarchar(1000)
DECLARE @Trig varchar(500)
DECLARE @sch varchar(500)
DECLARE @count int = 0

DECLARE TGCursor CURSOR FOR
SELECT ISNULL(tbl.name, vue.name) AS [schemaName]
     , trg.name AS triggerName
FROM sys.triggers trg
LEFT OUTER JOIN (SELECT tparent.object_id, ts.name
                 FROM sys.tables tparent
                 INNER JOIN sys.schemas ts ON ts.schema_id = tparent.schema_id)
    AS tbl ON tbl.object_id = trg.parent_id
LEFT OUTER JOIN (SELECT vparent.object_id, vs.name
                 FROM sys.views vparent
                 INNER JOIN sys.schemas vs ON vs.schema_id = vparent.schema_id)
    AS vue ON vue.object_id = trg.parent_id

OPEN TGCursor
FETCH NEXT FROM TGCursor INTO @sch, @Trig
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @SQLCmd = N'DROP TRIGGER [' + @sch + '].[' + @Trig + ']'
    IF @count >= 155 AND @count <= 160
    BEGIN
        EXEC sp_executesql @SQLCmd
        PRINT @SQLCmd
    END
    SET @count = @count + 1
    FETCH NEXT FROM TGCursor INTO @sch, @Trig
END
CLOSE TGCursor
DEALLOCATE TGCursor
Putting a trigger on a table that has the extended property 'microsoft_database_tools_support' is a simple way to produce this error. For example putting a trigger on the sysdiagrams table.
Apparently this extended property makes the object invisible to VS and the existence of a trigger on an invisible object is confusing. You can either make both objects visible by removing the microsoft_database_tools_support properties or make them both invisible by adding the property to the other object.
So in my example of putting a trigger named iut_sysdiagrams on sysdiagrams I'd use this:
EXEC sys.sp_addextendedproperty @level0type = N'SCHEMA', @level0name = [dbo]
    , @level1type = N'TABLE', @level1name = [sysdiagrams]
    , @level2type = N'TRIGGER', @level2name = [iut_sysdiagrams]
    , @name = N'microsoft_database_tools_support', @value = 1;

Altering user-defined table types in SQL Server

How can I alter a user-defined table type in SQL Server ?
To my knowledge it is impossible to alter/modify a table type. You can create the type under a different name, drop the old type, and then rename the new one to the old name.
Credits to jkrajes
As per MSDN: "The user-defined table type definition cannot be modified after it is created."
This is kind of a hack, but it does seem to work. Below are the steps and an example of modifying a table type. One note: sp_refreshsqlmodule will fail if the change you made to the table type is a breaking change for a referencing object, typically a procedure.
1. Use sp_rename to rename the table type; I typically just add "z" to the beginning of the name.
2. Create a new table type with the original name and any modifications you need to make.
3. Step through each dependency and run sp_refreshsqlmodule on it.
4. Drop the renamed table type.
EXEC sys.sp_rename 'dbo.MyTableType', 'zMyTableType';
GO
CREATE TYPE dbo.MyTableType AS TABLE(
Id INT NOT NULL,
Name VARCHAR(255) NOT NULL
);
GO
DECLARE @Name NVARCHAR(776);
DECLARE REF_CURSOR CURSOR FOR
SELECT referencing_schema_name + '.' + referencing_entity_name
FROM sys.dm_sql_referencing_entities('dbo.MyTableType', 'TYPE');

OPEN REF_CURSOR;
FETCH NEXT FROM REF_CURSOR INTO @Name;
WHILE (@@FETCH_STATUS = 0)
BEGIN
    EXEC sys.sp_refreshsqlmodule @name = @Name;
    FETCH NEXT FROM REF_CURSOR INTO @Name;
END;
CLOSE REF_CURSOR;
DEALLOCATE REF_CURSOR;
GO
DROP TYPE dbo.zMyTableType;
GO
WARNING:
This can be destructive to your database, so you'll want to test this on a development environment first.
Here are simple steps that minimize tedium and don't require error-prone semi-automated scripts or pricey tools.
Keep in mind that you can generate DROP/CREATE statements for multiple objects from the Object Explorer Details window (when generated this way, DROP and CREATE scripts are grouped, which makes it easy to insert logic between Drop and Create actions):
Back up your database in case anything goes wrong!
Automatically generate the DROP/CREATE statements for all dependencies (or generate for all "Programmability" objects to eliminate the tedium of finding dependencies).
Between the DROP and CREATE [dependencies] statements (after all DROP, before all CREATE), insert generated DROP/CREATE [table type] statements, making the changes you need with CREATE TYPE.
Run the script, which drops all dependencies/UDTTs and then recreates [UDTTs with alterations]/dependencies.
If you have smaller projects where it might make sense to change the infrastructure architecture, consider eliminating user-defined table types. Entity Framework and similar tools allow you to move most, if not all, of your data logic to your code base where it's easier to maintain.
To generate the DROP/CREATE statements for multiple objects, you can right-click your Database > Tasks > Generate Scripts... (as shown in the screenshot below). Notice:
DROP statements are before CREATE statements
DROP statements are in dependency order (i.e. reverse of CREATE)
CREATE statements are in dependency order
Simon Zeinstra has found the solution!
But, I used Visual Studio community 2015 and I didn't even have to use schema compare.
Using SQL Server Object Explorer, I found my user-defined table type in the DB. I right-clicked the table type and selected the option that opens its definition. This opened a code tab in the IDE with the T-SQL code visible and editable. I simply changed the definition (in my case just increased the size of an nvarchar field) and clicked the Update Database button in the top-left of the tab.
Hey Presto! - a quick check in SSMS and the udtt definition has been modified.
Brilliant - thanks Simon.
If you can use a Database project in Visual Studio, you can make your changes in the project and use schema compare to synchronize the changes to your database.
This way, dropping and recreating the dependent objects is handled by the change script.
You should drop the old table type and create a new one. However, if it has any dependencies (any stored procedures using it), you won't be able to drop it. I've posted another answer on how to automate the process of temporarily dropping all stored procedures, modifying the table type, and then restoring the stored procedures.
Just had to alter a user-defined table type in one of my projects. Here are the steps I employed:
1. Find all the SPs using the user-defined table type.
2. Save a create script for all the SP(s) found.
3. Drop the SP(s).
4. Save a create script for the user-defined table type you wish to alter.
5. Add the additional column(s) or changes you need to that script.
6. Drop the user-defined table type.
7. Run the create script for the user-defined table type.
8. Run the create script for the SP(s).
Then start modifying the SP(s) accordingly.
You can't ALTER/MODIFY your TYPE. You have to drop the existing one and re-create it with the correct name/datatype, or add new column(s).
I created two stored procedures for this. The first, create_or_alter_udt_preprocess, takes the UDT name as input, drops all the stored procs/functions that use the UDT, drops the UDT, and returns a SQL script to recreate all the procedures/functions. The second, create_or_alter_udt_postprocess, takes the script output by the first proc and executes it.
With the two procs, changing a UDT can be done as follows:
call create_or_alter_udt_preprocess;
create the udt with a new definition;
call create_or_alter_udt_postprocess;
Use a transaction to avoid losing the original procs in case of errors.
create or alter proc create_or_alter_udt_postprocess(@udt_postprocess_data xml)
as
begin
    if @udt_postprocess_data is null
        return;

    declare @obj_cursor cursor
    set @obj_cursor = cursor fast_forward for
        select n.c.value('.', 'nvarchar(max)') as definition
        from @udt_postprocess_data.nodes('/Objects/definition') as n(c)

    open @obj_cursor;
    declare @definition nvarchar(max);
    fetch next from @obj_cursor into @definition;
    while (@@fetch_status = 0)
    begin
        exec sp_executesql @stmt = @definition
        fetch next from @obj_cursor into @definition
    end
    close @obj_cursor;
    deallocate @obj_cursor;
end
create or alter proc create_or_alter_udt_preprocess(@udt nvarchar(200), @udt_postprocess_data xml out)
as
begin
    set @udt_postprocess_data = null;
    if TYPE_ID(@udt) is null
        return;

    declare @drop_scripts nvarchar(max);
    select @drop_scripts = (
        (select N';' + drop_script
         from
         (
             select
                 drop_script = N'drop ' + case sys.objects.type when 'P' then N'proc ' else N'function ' end
                     + sys.objects.name + N';' + nchar(10) + nchar(13)
             from sys.sql_expression_dependencies d
             join sys.sql_modules m on m.object_id = d.referencing_id
             join sys.objects on sys.objects.object_id = m.object_id
             where referenced_id = TYPE_ID(@udt)
         ) dependencies
         for xml path(''), type
        ).value('.', 'nvarchar(max)')
    );

    set @udt_postprocess_data =
        (select
             definition
         from sys.sql_expression_dependencies d
         join sys.sql_modules m on m.object_id = d.referencing_id
         join sys.objects on sys.objects.object_id = m.object_id
         where referenced_id = TYPE_ID(@udt)
         for xml path(''), root('Objects'));

    exec sp_executesql @stmt = @drop_scripts;
    exec sp_droptype @udt;
end
Example usage:
begin tran
declare @udt_postprocess_data xml;
exec create_or_alter_udt_preprocess @udt = 'test_list', @udt_postprocess_data = @udt_postprocess_data out;

CREATE TYPE test_list AS TABLE(
    test_name nvarchar(50) NULL
);

exec create_or_alter_udt_postprocess @udt_postprocess_data = @udt_postprocess_data;
commit;
Code to set up the example usage:
CREATE TABLE [dbo].[test_table](
[test_id] [int] IDENTITY(1,1) NOT NULL, [test_name] [varchar](20) NULL
) ON [USERDATA]
GO
CREATE TYPE test_list AS TABLE(test_name nvarchar(20) NULL)
GO
create proc add_tests(
    @test_list test_list readonly)
as
begin
    SET NOCOUNT ON;
    insert into test_table(test_name)
    select test_name
    from @test_list;
end;
GO
create proc add_tests2(
    @test_list test_list readonly)
as
begin
    SET NOCOUNT ON;
    insert into test_table(test_name)
    select test_name
    from @test_list;
end;

How to completely clean out a SQL Server 2005 database?

I made a copy of a DB that is used for a web app, in order to make a new instance of this web app. I am wondering: how do I remove all data, transactions, and whatnot, so that it is just a clean, empty shell of tables ready to be written with new data?
Sql Server Database Publishing Wizard. Create a script with just the schema, specifying to drop the existing objects.
run this script:
select 'TRUNCATE TABLE ' + name from sysobjects where xtype='U'
and then paste the results into a new script and run that
(And for God's sake, be careful!) :)
EDIT
From the comments, it seems TRUNCATE can't delete rows from tables that are referenced by foreign keys.
You could use
select 'DELETE FROM ' + name from sysobjects where xtype='U'
and you would also have to rearrange the output to delete from child tables first. Others have suggested scripting a clean database and that is probably a better idea TBH.
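Rearranging the output so child tables are deleted first is a topological sort of the foreign-key graph. A minimal sketch using Python's stdlib sorter (the table names are made up):

```python
from graphlib import TopologicalSorter

def delete_order(fk_edges):
    """Return a child-first DELETE order given (child, parent) FK pairs.

    A child table must be emptied before the parent it references, so
    each child is recorded as a prerequisite of its parent.
    """
    deps = {}
    for child, parent in fk_edges:
        deps.setdefault(parent, set()).add(child)
        deps.setdefault(child, set())
    return list(TopologicalSorter(deps).static_order())

# OrderLine references Order, Order references Customer
order = delete_order([("OrderLine", "Order"), ("Order", "Customer")])
print(order)  # children first: OrderLine, then Order, then Customer
```

The resulting list is the order in which to run the generated DELETE statements.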
Uncomment the -- to actually run... BE CAREFUL!!
DECLARE @t varchar(1024)
DECLARE tbl_cur CURSOR FOR
SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'
OPEN tbl_cur
FETCH NEXT FROM tbl_cur INTO @t
WHILE @@FETCH_STATUS = 0
BEGIN
    --EXEC ('TRUNCATE TABLE ' + @t)
    FETCH NEXT FROM tbl_cur INTO @t
END
CLOSE tbl_cur
DEALLOCATE tbl_cur
EDIT:
In answer to the comment question... damn good question. I imagine you could find all the foreign keys and save them off
SELECT 'ALTER TABLE ' + b.TABLE_NAME + ' WITH CHECK ADD CONSTRAINT [' + a.CONSTRAINT_NAME + '] FOREIGN KEY '
+ c.COLUMN_NAME + ' REFERENCES [' + d.TABLE_NAME +'] ([' + e.COLUMN_NAME + '])'
FROM INFORMATION_SCHEMA.REFERENTIAL_CONSTRAINTS a
INNER JOIN INFORMATION_SCHEMA.TABLE_CONSTRAINTS b
ON a.CONSTRAINT_NAME = b.CONSTRAINT_NAME
INNER JOIN INFORMATION_SCHEMA.TABLE_CONSTRAINTS d
ON a.UNIQUE_CONSTRAINT_NAME = d.CONSTRAINT_NAME
INNER JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE c
ON a.CONSTRAINT_NAME = c.CONSTRAINT_NAME
INNER JOIN (
SELECT
f.TABLE_NAME,
g.COLUMN_NAME
FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS f
INNER JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE g
ON f.CONSTRAINT_NAME = g.CONSTRAINT_NAME
WHERE f.CONSTRAINT_TYPE = 'PRIMARY KEY'
) e
ON e.TABLE_NAME = d.TABLE_NAME
ORDER BY a.CONSTRAINT_NAME
and then you could drop all of them (I don't believe it matters in which order you drop the constraints)
SELECT 'ALTER TABLE ' + col.TABLE_NAME + ' DROP CONSTRAINT ' + u.CONSTRAINT_NAME
FROM INFORMATION_SCHEMA.COLUMNS col
INNER JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE u
ON col.TABLE_NAME = u.TABLE_NAME
AND col.COLUMN_NAME = u.COLUMN_NAME
INNER JOIN INFORMATION_SCHEMA.table_constraints t
ON u.CONSTRAINT_NAME = t.CONSTRAINT_NAME
WHERE t.CONSTRAINT_TYPE = 'FOREIGN KEY'
and THEN use the first cursor to truncate all the tables. Then you can use the results of the script you saved off to recreate all of the FK relationships.
I don't know of any one step magical silver bullet command to do so, but if you want to preserve your tables/schemas, you'd probably need to script a truncate table for each.
Alternatively, you could script out the whole database and use that one script to regenerate a new database after you drop the "used" one. Making sense?
Right click on the Database you want to deal with, select Script Database As (3rd option from the top), then the option DROP and CREATE to ... at which point maybe you want to do this to a file or the clipboard and paste it somewhere.
Then, with this file handy as your script, run it to create a clean nice database.
You could create an empty database, and then use something like SQL Compare to compare your existing database against the empty one to generate scripts to recreate the database entirely from scratch.
You could also use the following SQL:
--// Switch to the database to be modified
USE DatabaseName;
--// The following commands need to be run for each table
--// You could perhaps automate this by using a cursor
--// First truncate the table and remove all data
TRUNCATE TABLE MyTable;
--// Also reset the identity seed
DBCC CHECKIDENT (MyTable, reseed, 1)
I'd recommend re-creating the database structure from scratch, rather than doing a backup-and-restore-to-new-database, as this will give you a completely clean database without any 'residue' (like stuff in the transaction log).
Truncating will work if you have no foreign keys defined (and if you don't have any, please add them).
If you're using SQL Server 2005 (2008 may be the same), you can generate a script for the entire database from within SQL Server Management Studio. Right-click on the database you want to script, then go to Tasks > Generate Scripts. Script out all the objects; you can then use this script to build a fresh copy of the DB based on just the schema.
This article presents a stored procedure without any of the mentioned problems.
The key is to disable referential integrity :)
You may want to consider just generating a t-sql script that only includes the structure from your existing database. The SQL Server Management Console makes this very easy, as you just need to right click on your original database, select 'tasks->generate scripts'. From there, just click through the defaults and select the objects that you want to duplicate (tables, etc).
This generates a nice T-SQL script that you can apply to any blank database, giving you the structure that you are looking for without the data. To me, this seems to be a more appropriate option as compared to truncation.
you would just truncate each table as in
use [dbname]
truncate table [table]
where [dbname] is the name of the copied database, and you would copy the 2nd line for each table in the database.
I'm sure that with 5-10 minutes you could create a script that reads over all the tables listed in the system tables and uses that information in a while loop, so you don't have to write "truncate table [table]" for each table; but that's the general idea.
--
Ok,
To all who answered after me: I'm getting this mental "guilt" thing going on because I didn't write that he should have created a T-SQL script and re-created the database from that.
There were several reasons why I didn't go that route:
1. You don't know what sorts of "static" data he has in the database.
2. He specifically asked about how to clear the database.
3. I don't currently know what the OP has built into the rest of the system; there could potentially be dependencies such that he needs a copy of the original in order to satisfy a condition.
Had the gentleman asked this in a different manner, I may have indeed answered like so many have and simply said to script the database out.

MSSQL Database Cleanup - How do you find unused objects (Tables, Views, Procs, Functions)

Let's say you have inherited an MS SQL 2000 or 2005 database, and you know that some of the Tables, Views, Procs, and Functions are not actually used in the final product.
Is there some kind of internal logging or other mechanism that could tell me which objects are NOT being called, or have been called only a few times versus thousands of times?
This SO question, Identifying Unused Objects In Microsoft SQL Server 2005, might be relevant.
The answer will depend a little on how the database has been put together, but my approach to a similar problem was 3-fold:
Figure out which objects have no internal dependencies. You can work this out from queries against sysdepends such as:
select
id,
name
from
sys.sysdepends sd
inner join sys.sysobjects so
on so.id = sd.id
where
not exists (
select
1
from
sysdepends sd2
where
sd2.depid = so.id
)
You should combine this with collecting the type of object (sysobjects.xtype), as you'll only want to isolate the tables, functions, stored procs and views. Also ignore any procedures starting with "sp_", unless people have been creating procedures with those names for your application!
Many of the returned procedures may be your application's entry points. That is to say the procedures that are called from your application layer or from some other remote call and don't have any objects that depend on them within the database.
Assuming the process won't be too invasive (it will create some additional load, though not too much) you can now switch on some profiling of the SP:Starting, SQL:BatchStarting and / or SP:StmtStarting events. Run this for as long as you see fit, ideally logging into a sql table for easy cross referencing. You should be able to eliminate many of the procedures that are called directly from your application.
By cross referencing the text data from this log and your dependent object list you will hopefully have isolated most of the unused procedures.
Finally, you may want to take your candidate list resulting from this process and grep your sourcecode base against them. This is a cumbersome task and just because you find references in your code doesn't mean you need them! It may simply be that the code hasn't been removed though it's now logically inaccessible.
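The cross-referencing in the steps above boils down to set subtraction: everything defined, minus what other objects depend on, minus what the profiler trace saw. A minimal sketch with hypothetical object names:

```python
def unused_candidates(all_objects, internally_referenced, seen_in_trace):
    """Objects that neither appear in the trace nor are depended on by
    another object are candidates for removal. Candidates only:
    grep the code base before dropping anything.
    """
    return sorted(set(all_objects) - set(internally_referenced) - set(seen_in_trace))

objs = ["uspGetOrders", "uspLegacyReport", "vwActiveUsers", "fnTax"]
referenced = ["fnTax"]                       # depended on by another module
traced = ["uspGetOrders", "vwActiveUsers"]   # seen in SP:Starting events
print(unused_candidates(objs, referenced, traced))  # ['uspLegacyReport']
```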
This is far from a perfect process. A relatively clean alternative is to set up far more detailed (and therefore invasive) profiling on the server to monitor all the activity. This can include every SQL statement called while the log is active. You can then work back through the dependent tables, or even cross-database dependencies, from this text data. I've found the reliability of the log detail (too many rows per second attempting to be parsed) and the sheer quantity of data difficult to deal with. If your application is less likely to suffer from this, it may be a good approach.
Caveat:
Because, so far as I'm aware, there isn't a perfect answer to this, be particularly wary of removing tables. Procedures, functions and views are easily replaced if something goes wrong (though make sure you have them in source control before burning them, of course!). If you're feeling really nervous, why not rename the table and create a view with the old name; you've then got an easy out.
We can also find unused columns and tables using the following query. I tried to write a cursor; it will give you information about each column in each table.
declare @name varchar(200), @id bigint, @columnname varchar(500)

declare @temptable table
(
    table_name varchar(500),
    Status bit
)

declare @temp_column_name table
(
    table_name varchar(500),
    column_name varchar(500),
    Status bit
)

declare find_table_dependency cursor for
select name, id from sysobjects where xtype = 'U'

open find_table_dependency
fetch find_table_dependency into @name, @id
while @@fetch_status = 0
begin
    if exists(select top 1 name from sysobjects where id in
        (select id from syscomments where text like '%' + @name + '%'))
        insert into @temptable
        select @name, 1
    else
        insert into @temptable
        select @name, 0

    declare find_column_dependency cursor for
    select name from syscolumns where id = @id

    open find_column_dependency
    fetch find_column_dependency into @columnname
    while @@fetch_status = 0
    begin
        if exists(select top 1 name from sysobjects where id in
            (select id from syscomments where text like '%' + @columnname + '%'))
            insert into @temp_column_name
            select @name, @columnname, 1
        else
            insert into @temp_column_name
            select @name, @columnname, 0

        fetch find_column_dependency into @columnname
    end
    close find_column_dependency
    deallocate find_column_dependency

    fetch find_table_dependency into @name, @id
end
close find_table_dependency
deallocate find_table_dependency

select * from @temptable
select * from @temp_column_name
