Is there a way in SQL Server to uniquely identify a database? - sql-server

Is there any way to uniquely identify a database?
If we were to copy a database to another machine, that instance is assumed to be different. I checked the master tables, but could not find any information that identifies this.

service_broker_guid in sys.databases comes pretty close to what you ask. It is a uniqueidentifier generated when the database is created, and it is preserved as the database is moved around (detach and attach, backup and restore, server rename, etc.). It can be explicitly changed with ALTER DATABASE ... SET NEW_BROKER;.
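A quick sketch of how to read it (run in the context of the database you want to identify):

```sql
-- service_broker_guid survives detach/attach and backup/restore,
-- so it can be compared across servers to spot copies of one database
SELECT name, service_broker_guid
FROM sys.databases
WHERE database_id = DB_ID();
```

Note that restoring the same backup to two servers yields two databases with the same GUID, so it identifies the lineage of a database rather than a particular physical copy.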

You could make a table in it with a unique name, and simply do a query on that. It's a bit of a hack, sure, but it'd work...
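A minimal sketch of that hack (the marker table name here is invented for illustration):

```sql
-- Create a uniquely named marker table once, in the original database
CREATE TABLE dbo.DbMarker_Prod01 (Placeholder INT);

-- Later, check which marker this copy carries
SELECT name FROM sys.tables WHERE name LIKE 'DbMarker[_]%';
```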

You could put the information in an extended property associated with the database itself:
USE AdventureWorks2008R2;
GO
EXEC sys.sp_addextendedproperty
@name = N'MS_DescriptionExample',
@value = N'AdventureWorks2008R2 Sample OLTP Database';
GO
http://msdn.microsoft.com/en-us/library/ms190243.aspx
In your case, I would use something like this:
EXEC sys.sp_addextendedproperty
@name = N'UniqueID',
@value = N'10156435463';
select objname, [name], [value]
from fn_listextendedproperty (null, null, null, null, null, null, null)

Create a scalar function that returns an ID/Version number:
create function fnGetThisDBID() returns varchar(32) as begin
return ('v1.1,origin=server1')
end
select 'version is: ' + dbo.fnGetThisDBID()

Related

SQL Server 2019 infinitely recompiles inline table valued function that does full text search with Parameterization = Forced

I have hit a problem with SQL Server that results in it infinitely recompiling a function.
To reproduce, create a new database with the option Parameterization = Forced or execute the following on an existing DB:
ALTER DATABASE [DatabaseName] SET PARAMETERIZATION FORCED WITH NO_WAIT
Then execute the following script:
CREATE TABLE dbo.TestTable(
ID int IDENTITY(1,1) NOT NULL,
FullTextField varchar(100) NULL,
CONSTRAINT PK_TestTable PRIMARY KEY CLUSTERED
(ID ASC)
)
GO
IF NOT EXISTS(SELECT 1 FROM sysfulltextcatalogs WHERE name = 'FullTextCat')
CREATE FULLTEXT CATALOG FullTextCat;
GO
CREATE FULLTEXT INDEX ON dbo.TestTable (FullTextField) KEY INDEX PK_TestTable
ON FullTextCat
WITH
CHANGE_TRACKING AUTO
GO
CREATE OR ALTER FUNCTION dbo.fn_TestFullTextSearch(@Filter VARCHAR(8000))
RETURNS TABLE
AS
RETURN SELECT
ID,
FullTextField
FROM dbo.TestTable
WHERE CONTAINS(FullTextField, @Filter)
GO
SELECT * FROM dbo.fn_TestFullTextSearch('"a*"')
The query will never return. Running SQL Profiler to monitor SP:CacheInsert and SP:CacheRemove will show SQL Server doing this endlessly, and the SQL logs will show countless "A possible infinite recompile was detected for SQLHANDLE" messages.
Setting the Parameterization = Simple works around the issue but we need this to be set to Forced for other reasons.
Has anyone come across this issue before and/or have a suggested solution?
Thanks,
Chuck
While I still experience the problem with the original code I provided, by following @Martin's approach of explicitly parameterizing the call to the function:
EXEC sys.sp_executesql N'SELECT * FROM dbo.fn_TestFullTextSearch(@Filter)', N'@Filter VARCHAR(4)', @Filter = '"a*"'
I have been able to successfully work around the problem.

SQL Server 2014 sp_msforeachdb gives different results than running against individual database

My office is changing our linked servers. As a result, I need to get a list of every single view from every database on our instance that points to the current linked server so we can know what needs to be replaced.
After doing some research online, I came up with this solution to get a list of all the views that reference the linked server:
Create a temp table:
CREATE TABLE [dbo].[#TMP]
(
[DBNAME] NVARCHAR(256) NULL,
[NAME] NVARCHAR(256) NOT NULL,
[DESC] NVARCHAR(MAX) NOT NULL
);
Then, I can take advantage of the sp_msforeachdb procedure to iterate through each database, and store this information in the temporary table:
DECLARE @command varchar(1000)
SELECT @command = 'INSERT INTO #TMP SELECT ''?'' as DBName, OBJECT_NAME(object_id), definition FROM sys.sql_modules WHERE definition LIKE ''%linkedservername%'''
EXEC sp_msforeachdb @command
When I do a SELECT * from #TMP, I see something fishy... the same 5 views are repeated for EVERY database. It's as if it took the first 5 views in a database that had my linked server name, and then just copied them for every database!
Things get even weirder if I modify my select command by changing sys.sql_modules to [?].sys.sql_modules; in this case, rather than getting 565 results, I only get 17!!!
Now, if I take out the INSERT INTO #TMP part of the command, and run the following:
DECLARE @command varchar(1000)
SELECT @command = 'SELECT ''?'' as DBName, OBJECT_NAME(object_id), definition FROM sys.sql_modules WHERE definition LIKE ''%linkedservername%'''
EXEC sp_msforeachdb @command
The results get even weirder! In one of my databases, named "DB_Jobs", in the column for views (there isn't a column name), 3 of the 4 results return NULL, and the last result returns "SCHTYPEMismatch". Stranger yet, the definition column returns accurate results!
Then, if I go to the database and run this:
SELECT OBJECT_NAME(object_id), definition
FROM [DB_Jobs].[sys].[sql_modules]
WHERE definition LIKE '%linkedservername%'
it returns the results perfectly!
What's going on? More importantly, what can I do in my original #command to utilize sp_msforeachdb and correctly return the results I want (and include the database name for each result)?
By the way, I'm on SQL Server 2014.
sp_msforeachdb is basically a global cursor that gives you access to each database in turn by referencing [?]. It doesn't execute your command in the context of each db by default. You can see this if you run a simple
EXEC sp_msforeachdb 'select db_name()'
For your first example, you're getting the same views because you're running the command against the same database every time. When you switch to [?].sys.sql_modules you start querying the sys.sql_modules in that database referenced by [?].
The problem with NULLs can be seen by running something like this:
SELECT OBJECT_NAME(object_id), definition FROM [msdb].[sys].[sql_modules] WHERE definition LIKE '%name%'
Run it in msdb and you'll have a name column full of object names and a column with definitions. Run it in master and the object names are now NULL even though you have the definitions. OBJECT_NAME() runs in the context of the current database, so you get NULLs unless you happen to have a matching object_id, in which case you're displaying the wrong object name. The definition column directly references the database you want, so you still get those.
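As a side note, a sketch of another way around the OBJECT_NAME context problem (assuming a version where OBJECT_NAME accepts the optional database_id second argument, available since SQL Server 2005 SP2):

```sql
-- Resolve object names in msdb regardless of the current database context
SELECT OBJECT_NAME(object_id, DB_ID('msdb')) AS ObjectName, definition
FROM msdb.sys.sql_modules
WHERE definition LIKE '%name%';
```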
To get your query to work as you want it you just need to USE [?] (I'm looking for a definition like %name% because I know it will be there for testing)
CREATE TABLE [dbo].[#TMP](
[DBNAME] NVARCHAR(256) NULL,
[NAME] NVARCHAR(256) NOT NULL,
[DESC] NVARCHAR(MAX) NOT NULL);
DECLARE @command varchar(1000)
SELECT @command = 'USE [?]; INSERT INTO #TMP SELECT ''?'' as DBName, OBJECT_NAME(object_id), definition FROM sys.sql_modules WHERE definition LIKE ''%name%'''
EXEC sp_msforeachdb @command
SELECT * FROM #TMP

Make default value in field equal to the database name in SQL server 2014

Updated
I would like to create a database table that contains the database name for every record that gets created and concatenates it with an auto-incrementing number. Please see below what I am trying to do:
CREATE DATABASE TEST_1234_5678
GO
USE TEST_1234_5678
GO
CREATE TABLE TBL_ANALYSIS
(ID INT NOT NULL IDENTITY(1,1),
DATABASE_NAME VARCHAR(14) DEFAULT DB_NAME(),
DESIRED_ID VARCHAR(20) DEFAULT DATABASE_NAME + CAST(ID AS VARCHAR)
);
I am unable to assign DESIRED_ID.
First of all, running CREATE DATABASE does not switch you to that context. If, for instance, I'm currently connected to My_DB and run your CREATE DATABASE command, the DB might be created, but I'll still be working on My_DB.
I only point that out because your question doesn't show that you're switching DB context, and that might be relevant, depending on how your server is set up.
CREATE DATABASE TEST_1234_5678
GO
USE TEST_1234_5678
GO
CREATE TABLE TBL_ANALYSIS
(ID INT NOT NULL IDENTITY(1,1),
DATABASE_NAME VARCHAR(14) DEFAULT DB_NAME()
)
GO
INSERT INTO TBL_ANALYSIS DEFAULT VALUES
GO
Depending on the error you're getting, this could be a number of different things. Assuming that you're in the right context and connected as a user with permissions, the error message you're getting could be extremely important to solving the issue. I would recommend ensuring that you're switching contexts, and if it still isn't working for you, you might consider editing your question and posting the actual error message.
EDIT: After getting some more information from you, I understand that the issue is in trying to create a column with a default value based on other columns in the table. This isn't something that is supported in SQL Server, but you can use a computed column to get the same information. Since this isn't a complex operation, you could do something as simple as this:
CREATE DATABASE TEST_1234_5678
GO
USE TEST_1234_5678
GO
CREATE TABLE TBL_ANALYSIS
(
ID INT NOT NULL IDENTITY(1,1)
,DATABASE_NAME VARCHAR(14) DEFAULT DB_NAME()
,DESIRED_ID AS DATABASE_NAME + CAST(ID AS VARCHAR(14))
)
GO
INSERT INTO TBL_ANALYSIS DEFAULT VALUES
GO
SELECT * FROM TBL_ANALYSIS
GO

Use the result of a system stored procedure as a queryable table

Note: the highest linked question does not solve the problem for system stored procedures, but it's close. With help of the commenters, I came to a working answer.
Trying to use statements such as the following for sp_spaceused, throws an error
SELECT * INTO #tblOutput exec sp_spaceused 'Account'
SELECT * FROM #tblOutput
The errors:
Must specify table to select from.
and:
An object or column name is missing or empty. For SELECT INTO statements, verify each column has a name. For other statements, look for empty alias names. Aliases defined as "" or [] are not allowed. Change the alias to a valid name.
When I fully declare a table variable, it works as expected, so it seems to me that the stored procedure does return an actual table.
CREATE TABLE #tblOutput (
name NVARCHAR(128) NOT NULL,
rows CHAR(11) NOT NULL,
reserved VARCHAR(18) NOT NULL,
data VARCHAR(18) NOT NULL,
index_size VARCHAR(18) NOT NULL,
unused VARCHAR(18) NOT NULL)
INSERT INTO #tblOutput exec sp_spaceused 'Response'
SELECT * FROM #tblOutput
Why is it not possible to use a temp table or table variable with the result set of EXECUTE sp_xxx? Or: does a more compact expression exist than having to predefine the full table each time?
(incidentally, and off-topic, Googling for the exact term SELECT * INTO #tmp exec sp_spaceused at the time of writing, returned exactly one result)
TL;DR: use SET FMTONLY OFF with OPENQUERY, details below.
It appears that the link provided by Daniel E. is only part of the solution. For instance, if you try:
-- no need to use sp_addlinkedserver
-- must fully specify sp_, because default db is master
SELECT * FROM OPENQUERY(
[SERVERNAME\SQL2008],
'exec somedb.dbo.sp_spaceused ''Account''')
you will receive the following error:
The OLE DB provider "SQLNCLI10" for linked server "LOCALSERVER\SQL2008" supplied inconsistent metadata for a column. The name was changed at execution time.
I found the solution through this post, and then a blog-post on OPENQUERY, which in turn told me that until SQL2008, you need to use SET FMTONLY OFF. The final solution, which is essentially surprisingly simple (and easier to accomplish since there is no need to specify a loopback linked server), is this:
SELECT * FROM OPENQUERY(
[SERVERNAME\SQL2008],
'SET FMTONLY OFF
EXEC somedb.dbo.sp_spaceused ''Account''')
In addition, if you haven't set DATA-ACCESS, you may get the following error:
Server 'SERVERNAME\SQL2008' is not configured for DATA ACCESS.
This can be remedied by running the following command:
EXEC sp_serveroption 'SERVERNAME\SQL2008', 'DATA ACCESS', TRUE
We cannot SELECT from a stored procedure, which is why SELECT * INTO ... EXEC sp_ will not work.
To get the result set returned from a stored procedure, we can INSERT INTO a table.
The SELECT INTO statement creates a table on the fly and inserts data from the source table/view/function. The only condition is that the source must exist and you must be able to SELECT from it.
SQL Server doesn't allow you to SELECT from sp_, so you can only use the INSERT INTO statement when executing a stored procedure; at run time, the returned result set is added to a table, and you can SELECT from that table at a later stage.
The INSERT INTO statement requires the destination table name to be an existing table. Therefore, whether you use a temp table, a table variable or a persistent SQL Server table, you need to create the table first, and only then can you use the syntax
INSERT INTO #TempTable
EXECUTE sp_Proc
USE [YOUR DATABASE NAME]
CREATE TABLE [YOURTABLENAME]
(Database_Name Varchar(128),
DataBase_Size VarChar(128),
unallocated_Space Varchar(128),
reserved Varchar(128),
data Varchar(128),
index_size Varchar(128),
unused Varchar(128)
);
INSERT INTO dbo.[YOUR TABLE NAME]
(
Database_Name,
DataBase_Size,
unallocated_Space,
reserved,
data,
index_size,
unused
)
EXEC sp_spaceused @oneresultset = 1
--To get it to return it all as one result set, add @oneresultset = 1 at the end and voilà, good to go for writing to a table. :)

Do you use source control for your database items? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Want to improve this question? Update the question so it can be answered with facts and citations by editing this post.
Closed 5 years ago.
Improve this question
I feel that my shop has a hole because we don't have a solid process in place for versioning our database schema changes. We do a lot of backups so we're more or less covered, but it's bad practice to rely on your last line of defense in this way.
Surprisingly, this seems to be a common thread. Many shops I have spoken to ignore this issue because their databases don't change often, and they basically just try to be meticulous.
However, I know how that story goes. It's only a matter of time before things line up just wrong and something goes missing.
Are there any best practices for this? What are some strategies that have worked for you?
A must-read: Get your database under version control. Check out the series of posts by K. Scott Allen.
When it comes to version control, the database is often a second or even third-class citizen. From what I've seen, teams that would never think of writing code without version control in a million years (and rightly so) can somehow be completely oblivious to the need for version control around the critical databases their applications rely on. I don't know how you can call yourself a software engineer and maintain a straight face when your database isn't under exactly the same rigorous level of source control as the rest of your code. Don't let this happen to you. Get your database under version control.
The databases themselves? No
The scripts that create them, including static data inserts, stored procedures and the like; of course. They're text files, they are included in the project and are checked in and out like everything else.
Of course in an ideal world your database management tool would do this; but you just have to be disciplined about it.
I absolutely love Rails ActiveRecord migrations. It abstracts the DML into Ruby scripts which can then be easily versioned in your source repository.
However, with a bit of work, you could do the same thing. Any DDL changes (ALTER TABLE, etc.) can be stored in text files. Keep a numbering system (or a date stamp) for the file names, and apply them in sequence.
Rails also has a 'version' table in the DB that keeps track of the last applied migration. You can do the same easily.
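A minimal sketch of such a version table in T-SQL (the table and column names here are my own invention):

```sql
-- Records every migration script that has been applied
CREATE TABLE dbo.SchemaVersion (
    Version   INT      NOT NULL PRIMARY KEY,
    AppliedAt DATETIME NOT NULL DEFAULT GETDATE()
);

-- Each numbered migration script ends by recording itself
INSERT INTO dbo.SchemaVersion (Version) VALUES (42);

-- The apply tool runs only scripts numbered above the current maximum
SELECT MAX(Version) AS CurrentVersion FROM dbo.SchemaVersion;
```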
Check out LiquiBase for managing database changes using source control.
You should never just log in and start entering "ALTER TABLE" commands to change a production database. The project I'm on has a database at every customer site, and so every change to the database is made in two places: a dump file that is used to create a new database on a new customer site, and an update file that is run on every update which checks your current database version number against the highest number in the file, and updates your database in place. So for instance, the last couple of updates:
if [ $VERSION \< '8.0.108' ] ; then
psql -U cosuser $dbName << EOF8.0.108
BEGIN TRANSACTION;
--
-- Remove foreign key that shouldn't have been there.
-- PCR:35665
--
ALTER TABLE migratorjobitems
DROP CONSTRAINT migratorjobitems_destcmaid_fkey;
--
-- Increment the version
UPDATE sys_info
SET value = '8.0.108'
WHERE key = 'DB VERSION';
END TRANSACTION;
EOF8.0.108
fi
if [ $VERSION \< '8.0.109' ] ; then
psql -U cosuser $dbName << EOF8.0.109
BEGIN TRANSACTION;
--
-- I missed a couple of cases when I changed the legacy playlist
-- from reporting showplaylistidnum to playlistidnum
--
ALTER TABLE featureidrequestkdcs
DROP CONSTRAINT featureidrequestkdcs_cosfeatureid_fkey;
ALTER TABLE featureidrequestkdcs
ADD CONSTRAINT featureidrequestkdcs_cosfeatureid_fkey
FOREIGN KEY (cosfeatureid)
REFERENCES playlist(playlistidnum)
ON DELETE CASCADE;
--
ALTER TABLE ticket_system_ids
DROP CONSTRAINT ticket_system_ids_showplaylistidnum_fkey;
ALTER TABLE ticket_system_ids
RENAME showplaylistidnum
TO playlistidnum;
ALTER TABLE ticket_system_ids
ADD CONSTRAINT ticket_system_ids_playlistidnum_fkey
FOREIGN KEY (playlistidnum)
REFERENCES playlist(playlistidnum)
ON DELETE CASCADE;
--
-- Increment the version
UPDATE sys_info
SET value = '8.0.109'
WHERE key = 'DB VERSION';
END TRANSACTION;
EOF8.0.109
fi
I'm sure there is a better way to do this, but it's worked for me so far.
Yes. Code is code. My rule of thumb is that I need to be able to build and deploy the application from scratch, without looking at a development or production machine.
The best practice I have seen is creating a build script to scrap and rebuild your database on a staging server. Each iteration was given a folder for database changes, and all changes were scripted as "DROP ... CREATE" pairs. This way you can roll back to an earlier version at any time by pointing the build at the folder for the version you want.
I believe this was done with NaNt/CruiseControl.
YES, I think it is important to version your database. Not the data, but the schema for certain.
In Ruby On Rails, this is handled by the framework with "migrations". Any time you alter the db, you make a script that applies the changes and check it into source control.
My shop liked that idea so much that we added the functionality to our Java-based build using shell scripts and Ant. We integrated the process into our deployment routine. It would be fairly easy to write scripts to do the same thing in other frameworks that don't support DB versioning out-of-the-box.
The new Database projects in Visual Studio provide source control and change scripts.
They have a nice tool that compares databases and can generate a script that converts the schema of one into the other, or updates the data in one to match the other.
The db schema is "shredded" to create many, many small .sql files, one per DDL command that describes the DB.
+tom
Additional info 2008-11-30
I have been using it as a developer for the past year and really like it. It makes it easy to compare my dev work to production and generate a script to use for the release. I don't know if it is missing features that DBAs need for "enterprise-type" projects.
Because the schema is "shredded" into sql files the source control works fine.
One gotcha is that you need to have a different mindset when you use a db project. The tool has a "db project" in VS, which is just the sql, plus an automatically generated local database which has the schema and some other admin data but none of your application data, plus your local dev db that you use for app data dev work. You are rarely aware of the automatically generated db, but you have to know it's there so you can leave it alone :). This special db is clearly recognizable because it has a GUID in its name.
The VS DB Project does a nice job of integrating db changes that other team members have made into your local project/associated db. but you need to take the extra step to compare the project schema with your local dev db schema and apply the mods. It makes sense, but it seems awkward at first.
DB Projects are a very powerful tool. They not only generate scripts but can apply them immediately. Be sure not to destroy your production db with it. ;)
I really like the VS DB projects and I expect to use this tool for all my db projects going forward.
+tom
Requiring the development teams to use a SQL database source control management system isn't a magic bullet that will prevent issues from happening. On its own, database source control introduces additional overhead, as the developers are required to save the changes they've made to an object in a separate SQL script, open the source control system client, check in the SQL script file using the client, and then apply the changes to the live database.
I can suggest using the SSMS add-in called ApexSQL Source Control. It allows developers to easily map database objects with the source control system via the wizard directly from SSMS. The add-in includes support for TFS, Git, Subversion and other SC systems. It also includes support for source controlling Static data.
After downloading and installing ApexSQL Source Control, simply right-click the database you want to version control and navigate to ApexSQL Source Control sub-menu in SSMS. Click the Link database to source control option, select the source control system and the development model. After that you’ll need to provide the log-in information and the repository string for the source control system you’ve chosen.
You can read this article for more information: http://solutioncenter.apexsql.com/sql-source-control-reduce-database-development-time/
I do by saving create/update scripts and a script that generates sampledata.
Yes, we do it by keeping our SQL as part of our build -- we keep DROP.sql, CREATE.sql, USERS.sql, VALUES.sql and version control these, so we can revert back to any tagged version.
We also have ant tasks which can recreate the db whenever needed.
Plus, the SQL is then tagged along with your source code that goes with it.
The most successful scheme I've ever used on a project has combined backups and differential SQL files. Basically we would take a backup of our db after every release and do an SQL dump so that we could create a blank schema from scratch if we needed to as well. Then anytime you needed to make a change to the DB you would add an alter script to the sql directory under version control. We would always prefix a sequence number or date to the file name, so the first change would be something like 01_add_created_on_column.sql, and the next script would be 02_added_customers_index. Our CI machine would check for these and run them sequentially on a fresh copy of the db that had been restored from the backup.
We also had some scripts in place that devs could use to re-initialize their local db to the current version with a single command.
We do source control all our database objects. And just to keep developers honest (because you can create objects without them being in source control), our DBAs periodically look for anything not in source control, and if they find anything, they drop it without asking if it is OK.
I use SchemaBank to version control all my database schema changes:
from day 1, I import my db schema dump into it
I started to change my schema design using a web browser (because they are SaaS / cloud-based)
when I want to update my db server, I generate the change (SQL) script from it and apply it to the db. In SchemaBank, they mandate that I commit my work as a version before I can generate an update script. I like this kind of practice so that I can always trace back when I need to.
Our team rule is NEVER touch the db server directly without storing the design work first. But it happens; somebody might be tempted to break the rule for the sake of convenience. We would import the schema dump again into SchemaBank, let it do the diff, and bash someone if a discrepancy is found. Although we could generate the alter scripts from it to bring our db and schema design back in sync, we just hate that.
By the way, they also let us create branches within the version control tree, so that I can maintain one for staging, one for production, and one for a coding sandbox.
A pretty neat web-based schema design tool with version control and change management.
I have everything necessary to recreate my DB from bare metal, minus the data itself. I'm sure there are lots of ways to do it, but all my scripts and such are stored off in subversion and we can rebuild the DB structure and such by pulling all that out of subversion and running an installer.
I typically build an SQL script for every change I make, and another to revert those changes, and keep those scripts under version control.
Then we have a means to create a new up-to-date database on demand, and can easily move between revisions. Every time we do a release, we lump the scripts together (takes a bit of manual work, but it's rarely actually hard) so we also have a set of scripts that can convert between versions.
Yes, before you say it, this is very similar to the stuff Rails and others do, but it seems to work pretty well, so I have no problems admitting that I shamelessly lifted the idea :)
I use SQL CREATE scripts exported from MySQL Workbench; then, using their "Export SQL ALTER" functionality, I end up with a series of create scripts (numbered, of course) and the alter scripts that can apply the changes between them.
3. Export SQL ALTER script
Normally you would have to write the ALTER TABLE statements by hand now, reflecting your changes you made to the model. But you can be smart and let Workbench do the hard work for you. Simply select File -> Export -> Forward Engineer SQL ALTER Script… from the main menu.
This will prompt you to specify the SQL CREATE file the current model should be compared to.
Select the SQL CREATE script from step 1. The tool will then generate the ALTER TABLE script for you and you can execute this script against your database to bring it up to date.
You can do this using the MySQL Query Browser or the mysql client. Voilà! Your model and database have now been synchronized!
Source: MySQL Workbench Community Edition: Guide to Schema Synchronization
All these scripts are, of course, kept under version control.
Yes, always. You should be able to recreate your production database structure with a useful set of sample data whenever needed. If you don't, over time minor changes to keep things running get forgotten, and then one day you get bitten, big time. It's insurance that you might not think you need, but the day you do, it's worth the price ten times over!
There has been a lot of discussion about the database model itself, but we also keep the required data in .SQL files.
For example, in order to be useful your application might need this in the install:
INSERT INTO Currency (CurrencyCode, CurrencyName)
VALUES ('AUD', 'Australian Dollars');
INSERT INTO Currency (CurrencyCode, CurrencyName)
VALUES ('USD', 'US Dollars');
We would have a file called currency.sql under subversion. As a manual step in the build process, we compare the previous currency.sql to the latest one and write an upgrade script.
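One possible shape for that upgrade script (a sketch; the NOT EXISTS guard makes it safe to re-run, and the currency value is just an example continuing the ones above):

```sql
-- Insert only the currencies that are not already present
INSERT INTO Currency (CurrencyCode, CurrencyName)
SELECT 'EUR', 'Euros'
WHERE NOT EXISTS
    (SELECT 1 FROM Currency WHERE CurrencyCode = 'EUR');
```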
We version and source control everything surrounding our databases:
DDL (create and alters)
DML (reference data, codes, etc.)
Data Model changes (using ERwin or ER/Studio)
Database configuration changes (permissions, security objects, general config changes)
We do all this with automated jobs using Change Manager and some custom scripts. We have Change Manager monitoring these changes and notifying when they are done.
I believe that every DB should be under source control, and developers should have an easy way to create their local database from scratch. Inspired by Visual Studio for Database Professionals, I've created an open-source tool that scripts MS SQL databases and provides an easy way of deploying them to your local DB engine. Try http://dbsourcetools.codeplex.com/ . Have fun,
- Nathan.
I source control the database schema by scripting out all objects (table definitions, indexes, stored procedures, etc.). But as for the data itself, I simply rely on regular backups. This ensures that all structural changes are captured with proper revision history, but doesn't burden the database each time data changes.
At our business we use database change scripts. When a script is run, its name is stored in the database and it won't run again unless that row is removed. Scripts are named based on date, time and code branch, so controlled execution is possible.
Lots and lots of testing is done before the scripts are run in the live environment, so "oopsies" only happen, generally speaking, on development databases.
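A minimal sketch of that run-once bookkeeping (the table and script names here are hypothetical):

```sql
-- Scripts record themselves on first run and are skipped afterwards
IF NOT EXISTS (SELECT 1 FROM dbo.AppliedScripts
               WHERE ScriptName = '20090422_1030_main_AddOrderIndex.sql')
BEGIN
    -- ... the actual schema change goes here ...

    INSERT INTO dbo.AppliedScripts (ScriptName, AppliedAt)
    VALUES ('20090422_1030_main_AddOrderIndex.sql', GETDATE());
END
```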
We're in the process of moving all the databases to source control. We're using SQL Compare to script out the database (a Professional edition feature, unfortunately) and putting that result into SVN.
The success of your implementation will depend a lot on the culture and practices of your organization. People here believe in creating a database per application. There is a common set of databases that are used by most applications as well causing a lot of interdatabase dependencies (some of them are circular). Putting the database schemas into source control has been notoriously difficult because of the interdatabase dependencies that our systems have.
Best of luck to you, the sooner you try it out the sooner you'll have your issues sorted out.
I have used the dbdeploy tool from ThoughtWorks at http://dbdeploy.com/. It encourages the use of migration scripts. Each release, we consolidated the change scripts into a single file to ease understanding and to allow DBAs to 'bless' the changes.
This has always been a big annoyance for me too - it seems like it is just way too easy to make a quick change to your development database, save it (forgetting to save a change script), and then you're stuck. You could undo what you just did and redo it to create the change script, or write it from scratch if you want of course too, though that's a lot of time spent writing scripts.
A tool that I have used in the past that has helped with this some is SQL Delta. It will show you the differences between two databases (SQL server/Oracle I believe) and generate all the change scripts necessary to migrate A->B. Another nice thing it does is show all the differences between database content between the production (or test) DB and your development DB. Since more and more apps store configuration and state that is crucial to their execution in database tables, it can be a real pain to have change scripts that remove, add, and alter the proper rows. SQL Delta shows the rows in the database just like they would look in a Diff tool - changed, added, deleted.
An excellent tool. Here is the link:
http://www.sqldelta.com/
RedGate is great, we generate new snapshots when database changes are made (a tiny binary file) and keep that file in the projects as a resource. Whenever we need to update the database, we use RedGate's toolkit to update the database, as well as being able to create new databases from empty ones.
RedGate also makes Data snapshots, while I haven't personally worked with them, they are just as robust.
FYI This was also brought up a few days ago by Dana ... Stored procedures/DB schema in source control
Here is a sample poor man's solution: a trigger implementing tracking of changes on db objects (via DDL statements) on a SQL Server 2005 / 2008 database. It also contains a simple sample of how to enforce the usage of a required Version xml tag in the source code for each sql command run on the database, plus tracking of the current db version and type (dev, test, qa, fb, prod).
One could extend it with additional required attributes as needed.
The code is rather long - it creates the empty database, the needed tracking table structure, the required db functions, and the populating trigger, all running under a [ga] schema.
USE [master]
GO
/****** Object: Database [DBGA_DEV] Script Date: 04/22/2009 13:22:01 ******/
CREATE DATABASE [DBGA_DEV] ON PRIMARY
( NAME = N'DBGA_DEV', FILENAME = N'D:\GENAPP\DATA\DBFILES\DBGA_DEV.mdf' , SIZE = 3072KB , MAXSIZE = UNLIMITED, FILEGROWTH = 1024KB )
LOG ON
( NAME = N'DBGA_DEV_log', FILENAME = N'D:\GENAPP\DATA\DBFILES\DBGA_DEV_log.ldf' , SIZE = 6208KB , MAXSIZE = 2048GB , FILEGROWTH = 10%)
GO
ALTER DATABASE [DBGA_DEV] SET COMPATIBILITY_LEVEL = 100
GO
IF (1 = FULLTEXTSERVICEPROPERTY('IsFullTextInstalled'))
begin
EXEC [DBGA_DEV].[dbo].[sp_fulltext_database] @action = 'enable'
end
GO
ALTER DATABASE [DBGA_DEV] SET ANSI_NULL_DEFAULT OFF
GO
ALTER DATABASE [DBGA_DEV] SET ANSI_NULLS OFF
GO
ALTER DATABASE [DBGA_DEV] SET ANSI_PADDING ON
GO
ALTER DATABASE [DBGA_DEV] SET ANSI_WARNINGS OFF
GO
ALTER DATABASE [DBGA_DEV] SET ARITHABORT OFF
GO
ALTER DATABASE [DBGA_DEV] SET AUTO_CLOSE OFF
GO
ALTER DATABASE [DBGA_DEV] SET AUTO_CREATE_STATISTICS ON
GO
ALTER DATABASE [DBGA_DEV] SET AUTO_SHRINK OFF
GO
ALTER DATABASE [DBGA_DEV] SET AUTO_UPDATE_STATISTICS ON
GO
ALTER DATABASE [DBGA_DEV] SET CURSOR_CLOSE_ON_COMMIT OFF
GO
ALTER DATABASE [DBGA_DEV] SET CURSOR_DEFAULT GLOBAL
GO
ALTER DATABASE [DBGA_DEV] SET CONCAT_NULL_YIELDS_NULL OFF
GO
ALTER DATABASE [DBGA_DEV] SET NUMERIC_ROUNDABORT OFF
GO
ALTER DATABASE [DBGA_DEV] SET QUOTED_IDENTIFIER OFF
GO
ALTER DATABASE [DBGA_DEV] SET RECURSIVE_TRIGGERS OFF
GO
ALTER DATABASE [DBGA_DEV] SET DISABLE_BROKER
GO
ALTER DATABASE [DBGA_DEV] SET AUTO_UPDATE_STATISTICS_ASYNC OFF
GO
ALTER DATABASE [DBGA_DEV] SET DATE_CORRELATION_OPTIMIZATION OFF
GO
ALTER DATABASE [DBGA_DEV] SET TRUSTWORTHY OFF
GO
ALTER DATABASE [DBGA_DEV] SET ALLOW_SNAPSHOT_ISOLATION OFF
GO
ALTER DATABASE [DBGA_DEV] SET PARAMETERIZATION SIMPLE
GO
ALTER DATABASE [DBGA_DEV] SET READ_COMMITTED_SNAPSHOT OFF
GO
ALTER DATABASE [DBGA_DEV] SET HONOR_BROKER_PRIORITY OFF
GO
ALTER DATABASE [DBGA_DEV] SET READ_WRITE
GO
ALTER DATABASE [DBGA_DEV] SET RECOVERY FULL
GO
ALTER DATABASE [DBGA_DEV] SET MULTI_USER
GO
ALTER DATABASE [DBGA_DEV] SET PAGE_VERIFY CHECKSUM
GO
ALTER DATABASE [DBGA_DEV] SET DB_CHAINING OFF
GO
EXEC [DBGA_DEV].sys.sp_addextendedproperty @name=N'DbType', @value=N'DEV'
GO
EXEC [DBGA_DEV].sys.sp_addextendedproperty @name=N'DbVersion', @value=N'0.0.1.20090414.1100'
GO
USE [DBGA_DEV]
GO
/****** Object: Schema [ga] Script Date: 04/22/2009 13:21:29 ******/
CREATE SCHEMA [ga] AUTHORIZATION [dbo]
GO
EXEC sys.sp_addextendedproperty @name=N'MS_Description', @value=N'Contains the objects of the Generic Application database' , @level0type=N'SCHEMA',@level0name=N'ga'
GO
/****** Object: Table [ga].[tb_DataMeta_ObjChangeLog] Script Date: 04/22/2009 13:21:40 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [ga].[tb_DataMeta_ObjChangeLog](
[LogId] [int] IDENTITY(1,1) NOT NULL,
[TimeStamp] [timestamp] NOT NULL,
[DatabaseName] [varchar](256) NOT NULL,
[SchemaName] [varchar](256) NOT NULL,
[DbVersion] [varchar](20) NOT NULL,
[DbType] [varchar](20) NOT NULL,
[EventType] [varchar](50) NOT NULL,
[ObjectName] [varchar](256) NOT NULL,
[ObjectType] [varchar](25) NOT NULL,
[Version] [varchar](50) NULL,
[SqlCommand] [varchar](max) NOT NULL,
[EventDate] [datetime] NOT NULL,
[LoginName] [varchar](256) NOT NULL,
[FirstName] [varchar](256) NULL,
[LastName] [varchar](50) NULL,
[ChangeDescription] [varchar](1000) NULL,
[Description] [varchar](1000) NULL,
[ObjVersion] [varchar](20) NOT NULL
) ON [PRIMARY]
GO
SET ANSI_PADDING ON
GO
EXEC sys.sp_addextendedproperty @name=N'MS_Description', @value=N'The database version as written in the extended prop of the database' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'TABLE',@level1name=N'tb_DataMeta_ObjChangeLog', @level2type=N'COLUMN',@level2name=N'DbVersion'
GO
EXEC sys.sp_addextendedproperty @name=N'MS_Description', @value=N'dev , test , qa , fb or prod' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'TABLE',@level1name=N'tb_DataMeta_ObjChangeLog', @level2type=N'COLUMN',@level2name=N'DbType'
GO
EXEC sys.sp_addextendedproperty @name=N'MS_Description', @value=N'The name of the object as it is registered in the sys.objects ' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'TABLE',@level1name=N'tb_DataMeta_ObjChangeLog', @level2type=N'COLUMN',@level2name=N'ObjectName'
GO
EXEC sys.sp_addextendedproperty @name=N'MS_Description', @value=N'' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'TABLE',@level1name=N'tb_DataMeta_ObjChangeLog', @level2type=N'COLUMN',@level2name=N'Description'
GO
SET IDENTITY_INSERT [ga].[tb_DataMeta_ObjChangeLog] ON
INSERT [ga].[tb_DataMeta_ObjChangeLog] ([LogId], [DatabaseName], [SchemaName], [DbVersion], [DbType], [EventType], [ObjectName], [ObjectType], [Version], [SqlCommand], [EventDate], [LoginName], [FirstName], [LastName], [ChangeDescription], [Description], [ObjVersion]) VALUES (3, N'DBGA_DEV', N'en', N'0.0.1.20090414.1100', N'DEV', N'DROP_TABLE', N'tb_BL_Products', N'TABLE', N' some', N'<EVENT_INSTANCE><EventType>DROP_TABLE</EventType><PostTime>2009-04-22T11:03:11.880</PostTime><SPID>57</SPID><ServerName>YSG</ServerName><LoginName>ysg\yordgeor</LoginName><UserName>dbo</UserName><DatabaseName>DBGA_DEV</DatabaseName><SchemaName>en</SchemaName><ObjectName>tb_BL_Products</ObjectName><ObjectType>TABLE</ObjectType><TSQLCommand><SetOptions ANSI_NULLS="ON" ANSI_NULL_DEFAULT="ON" ANSI_PADDING="ON" QUOTED_IDENTIFIER="ON" ENCRYPTED="FALSE"/><CommandText>drop TABLE [en].[tb_BL_Products] --<Version> some</Version>
</CommandText></TSQLCommand></EVENT_INSTANCE>', CAST(0x00009BF300B6271C AS DateTime), N'ysg\yordgeor', N'Yordan', N'Georgiev', NULL, NULL, N'0.0.0')
INSERT [ga].[tb_DataMeta_ObjChangeLog] ([LogId], [DatabaseName], [SchemaName], [DbVersion], [DbType], [EventType], [ObjectName], [ObjectType], [Version], [SqlCommand], [EventDate], [LoginName], [FirstName], [LastName], [ChangeDescription], [Description], [ObjVersion]) VALUES (4, N'DBGA_DEV', N'en', N'0.0.1.20090414.1100', N'DEV', N'CREATE_TABLE', N'tb_BL_Products', N'TABLE', N' 2.2.2 ', N'<EVENT_INSTANCE><EventType>CREATE_TABLE</EventType><PostTime>2009-04-22T11:03:18.620</PostTime><SPID>57</SPID><ServerName>YSG</ServerName><LoginName>ysg\yordgeor</LoginName><UserName>dbo</UserName><DatabaseName>DBGA_DEV</DatabaseName><SchemaName>en</SchemaName><ObjectName>tb_BL_Products</ObjectName><ObjectType>TABLE</ObjectType><TSQLCommand><SetOptions ANSI_NULLS="ON" ANSI_NULL_DEFAULT="ON" ANSI_PADDING="ON" QUOTED_IDENTIFIER="ON" ENCRYPTED="FALSE"/><CommandText>CREATE TABLE [en].[tb_BL_Products](
[ProducId] [int] NULL,
[ProductName] [nchar](10) NULL,
[ProductDescription] [varchar](5000) NULL
) ON [PRIMARY]
/*
<Version> 2.2.2 </Version>
*/
</CommandText></TSQLCommand></EVENT_INSTANCE>', CAST(0x00009BF300B62F07 AS DateTime), N'ysg\yordgeor', N'Yordan', N'Georgiev', NULL, NULL, N'0.0.0')
INSERT [ga].[tb_DataMeta_ObjChangeLog] ([LogId], [DatabaseName], [SchemaName], [DbVersion], [DbType], [EventType], [ObjectName], [ObjectType], [Version], [SqlCommand], [EventDate], [LoginName], [FirstName], [LastName], [ChangeDescription], [Description], [ObjVersion]) VALUES (5, N'DBGA_DEV', N'en', N'0.0.1.20090414.1100', N'DEV', N'DROP_TABLE', N'tb_BL_Products', N'TABLE', N' 2.2.2 ', N'<EVENT_INSTANCE><EventType>DROP_TABLE</EventType><PostTime>2009-04-22T11:25:12.620</PostTime><SPID>57</SPID><ServerName>YSG</ServerName><LoginName>ysg\yordgeor</LoginName><UserName>dbo</UserName><DatabaseName>DBGA_DEV</DatabaseName><SchemaName>en</SchemaName><ObjectName>tb_BL_Products</ObjectName><ObjectType>TABLE</ObjectType><TSQLCommand><SetOptions ANSI_NULLS="ON" ANSI_NULL_DEFAULT="ON" ANSI_PADDING="ON" QUOTED_IDENTIFIER="ON" ENCRYPTED="FALSE"/><CommandText>drop TABLE [en].[tb_BL_Products]
</CommandText></TSQLCommand></EVENT_INSTANCE>', CAST(0x00009BF300BC32F1 AS DateTime), N'ysg\yordgeor', N'Yordan', N'Georgiev', NULL, NULL, N'0.0.0')
INSERT [ga].[tb_DataMeta_ObjChangeLog] ([LogId], [DatabaseName], [SchemaName], [DbVersion], [DbType], [EventType], [ObjectName], [ObjectType], [Version], [SqlCommand], [EventDate], [LoginName], [FirstName], [LastName], [ChangeDescription], [Description], [ObjVersion]) VALUES (6, N'DBGA_DEV', N'en', N'0.0.1.20090414.1100', N'DEV', N'CREATE_TABLE', N'tb_BL_Products', N'TABLE', N' 2.2.2 ', N'<EVENT_INSTANCE><EventType>CREATE_TABLE</EventType><PostTime>2009-04-22T11:25:19.053</PostTime><SPID>57</SPID><ServerName>YSG</ServerName><LoginName>ysg\yordgeor</LoginName><UserName>dbo</UserName><DatabaseName>DBGA_DEV</DatabaseName><SchemaName>en</SchemaName><ObjectName>tb_BL_Products</ObjectName><ObjectType>TABLE</ObjectType><TSQLCommand><SetOptions ANSI_NULLS="ON" ANSI_NULL_DEFAULT="ON" ANSI_PADDING="ON" QUOTED_IDENTIFIER="ON" ENCRYPTED="FALSE"/><CommandText>CREATE TABLE [en].[tb_BL_Products](
[ProducId] [int] NULL,
[ProductName] [nchar](10) NULL,
[ProductDescription] [varchar](5000) NULL
) ON [PRIMARY]
/*
<Version> 2.2.2 </Version>
*/
</CommandText></TSQLCommand></EVENT_INSTANCE>', CAST(0x00009BF300BC3A69 AS DateTime), N'ysg\yordgeor', N'Yordan', N'Georgiev', NULL, NULL, N'0.0.0')
SET IDENTITY_INSERT [ga].[tb_DataMeta_ObjChangeLog] OFF
/****** Object: Table [ga].[tb_BLSec_LoginsForUsers] Script Date: 04/22/2009 13:21:40 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [ga].[tb_BLSec_LoginsForUsers](
[LoginsForUsersId] [int] IDENTITY(1,1) NOT NULL,
[LoginName] [nvarchar](100) NOT NULL,
[FirstName] [varchar](100) NOT NULL,
[SecondName] [varchar](100) NULL,
[LastName] [varchar](100) NOT NULL,
[DomainName] [varchar](100) NOT NULL
) ON [PRIMARY]
GO
SET ANSI_PADDING ON
GO
SET IDENTITY_INSERT [ga].[tb_BLSec_LoginsForUsers] ON
INSERT [ga].[tb_BLSec_LoginsForUsers] ([LoginsForUsersId], [LoginName], [FirstName], [SecondName], [LastName], [DomainName]) VALUES (1, N'ysg\yordgeor', N'Yordan', N'Stanchev', N'Georgiev', N'yordgeor')
SET IDENTITY_INSERT [ga].[tb_BLSec_LoginsForUsers] OFF
/****** Object: Table [en].[tb_BL_Products] Script Date: 04/22/2009 13:21:40 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [en].[tb_BL_Products](
[ProducId] [int] NULL,
[ProductName] [nchar](10) NULL,
[ProductDescription] [varchar](5000) NULL
) ON [PRIMARY]
GO
SET ANSI_PADDING ON
GO
/****** Object: StoredProcedure [ga].[procUtils_SqlCheatSheet] Script Date: 04/22/2009 13:21:37 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE [ga].[procUtils_SqlCheatSheet]
as
set nocount on
--what was the name of the table with something like role
/*
SELECT * from sys.tables where [name] like '%POC%'
*/
-- what are the columns of this table
/*
select column_name , DATA_TYPE , CHARACTER_MAXIMUM_LENGTH, table_name from Information_schema.columns where table_name='tbGui_ExecutePOC'
*/
-- find proc
--what was the name of procedure with something like role
/*
select * from sys.procedures where [name] like '%ext%'
exec sp_HelpText procName
*/
/*
exec sp_helpText procUtils_InsertGenerator
*/
--how to list all databases in sql server
/*
SELECT database_id AS ID, NULL AS ParentID, name AS Text FROM sys.databases ORDER BY [name]
*/
--HOW-TO LIST ALL TABLES IN A SQL SERVER 2005 DATABASE
/*
SELECT TABLE_NAME FROM [POC].INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
AND TABLE_NAME <> 'dtproperties'
ORDER BY TABLE_NAME
*/
--HOW-TO ENABLE XP_CMDSHELL START
-------------------------------------------------------------------------
-- configure verbose mode temporarily
-- EXECUTE sp_configure 'show advanced options', 1
-- RECONFIGURE WITH OVERRIDE
--GO
--ENABLE xp_cmdshell
-- EXECUTE sp_configure 'xp_cmdshell', '1'
-- RECONFIGURE WITH OVERRIDE
-- EXEC SP_CONFIGURE 'show advanced option', '1';
-- SHOW THE CONFIGURATION
-- EXEC SP_CONFIGURE;
--turn show advance options off
-- GO
--EXECUTE sp_configure 'show advanced options', 0
-- RECONFIGURE WITH OVERRIDE
-- GO
--HOW-TO ENABLE XP_CMDSHELL END
-------------------------------------------------------------------------
--HOW-TO IMPLEMENT SLEEP
-- sleep for 10 seconds
-- WAITFOR DELAY '00:00:10' SELECT * FROM My_Table
/* LIST ALL PRIMARY KEYS
SELECT
INFORMATION_SCHEMA.TABLE_CONSTRAINTS.TABLE_NAME AS TABLE_NAME,
INFORMATION_SCHEMA.KEY_COLUMN_USAGE.COLUMN_NAME AS COLUMN_NAME,
REPLACE(INFORMATION_SCHEMA.TABLE_CONSTRAINTS.CONSTRAINT_TYPE,' ', '_') AS CONSTRAINT_TYPE
FROM
INFORMATION_SCHEMA.TABLE_CONSTRAINTS
INNER JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE ON
INFORMATION_SCHEMA.TABLE_CONSTRAINTS.CONSTRAINT_NAME =
INFORMATION_SCHEMA.KEY_COLUMN_USAGE.CONSTRAINT_NAME
WHERE
INFORMATION_SCHEMA.TABLE_CONSTRAINTS.TABLE_NAME <> N'sysdiagrams'
ORDER BY
INFORMATION_SCHEMA.TABLE_CONSTRAINTS.TABLE_NAME ASC
*/
--HOW-TO COPY TABLE AND THE WHOLE TABLE DATA , COPY TABLE FROM DB TO DB
--==================================================START
/*
use Poc_Dev
go
drop table tbGui_LinksVisibility
use POc_test
go
select *
INTO [POC_Dev].[ga].[tbGui_LinksVisibility]
from [POC_TEST].[ga].[tbGui_LinksVisibility]
*/
--HOW-TO COPY TABLE AND THE WHOLE TABLE DATA , COPY TABLE FROM DB TO DB
--====================================================END
--=================================================== SEE TABLE METADATA START
/*
SELECT c.name AS [COLUMN_NAME], sc.data_type AS [DATA_TYPE], [value] AS
[DESCRIPTION] , c.max_length as [MAX_LENGTH] , c.is_nullable AS [OPTIONAL]
, c.is_identity AS [IS_PRIMARY_KEY] FROM sys.extended_properties AS ep
INNER JOIN sys.tables AS t ON ep.major_id = t.object_id
INNER JOIN sys.columns AS c ON ep.major_id = c.object_id AND ep.minor_id
= c.column_id
INNER JOIN INFORMATION_SCHEMA.COLUMNS sc ON t.name = sc.table_name and
c.name = sc.column_name
WHERE class = 1 and t.name = 'tbGui_ExecutePOC' ORDER BY SC.DATA_TYPE
*/
--=================================================== SEE TABLE METADATA END
/*
select * from Information_schema.columns
select table_name , column_name from Information_schema.columns where table_name='tbGui_Wizards'
*/
--=================================================== LIST ALL TABLES AND THEIR DESCRIPTIONS START
/*
SELECT T.name AS TableName, CAST(Props.value AS varchar(1000)) AS
TableDescription
FROM sys.tables AS T LEFT OUTER JOIN
(SELECT class, class_desc, major_id, minor_id,
name, value
FROM sys.extended_properties
WHERE (minor_id = 0) AND (class = 1)) AS
Props ON T.object_id = Props.major_id
WHERE (T.type = 'U') AND (T.name <> N'sysdiagrams')
ORDER BY TableName
*/
--=================================================== LIST ALL TABLES AND THEIR DESCRIPTIONS END
--=================================================== LIST ALL OBJECTS FROM DB START
/*
use DB
--HOW-TO LIST ALL PROCEDURE IN A DATABASE
select s.name from sysobjects s where type = 'P'
--HOW-TO LIST ALL TRIGGERS BY NAME IN A DATABASE
select s.name from sysobjects s where type = 'TR'
--HOW-TO LIST TABLES IN A DATABASE
select s.name from sysobjects s where type = 'U'
--how-to list all system tables in a database
select s.name from sysobjects s where type = 's'
--how-to list all the views in a database
select s.name from sysobjects s where type = 'v'
*/
/*
Similarly you can find out other objects created by user, simple change type =
C = CHECK constraint
D = Default or DEFAULT constraint
F = FOREIGN KEY constraint
L = Log
FN = Scalar function
IF = In-lined table-function
P = Stored procedure
PK = PRIMARY KEY constraint (type is K)
RF = Replication filter stored procedure
S = System table
TF = Table function
TR = Trigger
U = User table ( this is the one I discussed above in the example)
UQ = UNIQUE constraint (type is K)
V = View
X = Extended stored procedure
*/
--=================================================== HOW-TO SEE ALL MY PERMISSIONS START
/*
SELECT * FROM fn_my_permissions(NULL, 'SERVER');
USE poc_qa;
SELECT * FROM fn_my_permissions (NULL, 'database');
GO
*/
--=================================================== HOW-TO SEE ALL MY PERMISSIONS END
/*
--find table
use poc_dev
go
select s.name from sysobjects s where type = 'u' and s.name like '%Visibility%'
select * from tbGui_LinksVisibility
*/
/* find cursor
use poc
go
DECLARE @procName varchar(100)
DECLARE @cursorProcNames CURSOR
SET @cursorProcNames = CURSOR FOR
select name from sys.procedures where modify_date > '2009-02-05 13:12:15.273' order by modify_date desc
OPEN @cursorProcNames
FETCH NEXT
FROM @cursorProcNames INTO @procName
WHILE @@FETCH_STATUS = 0
BEGIN
set nocount off;
exec sp_HelpText @procName --- or print them
-- print @procName
FETCH NEXT
FROM @cursorProcNames INTO @procName
END
CLOSE @cursorProcNames
select @@error
*/
/* -- SEE STORED PROCEDURE EXT PROPS
SELECT ep.name as 'EXT_PROP_NAME' , SP.NAME , [value] as 'DESCRIPTION' FROM sys.extended_properties as ep left join sys.procedures as sp on sp.object_id = ep.major_id where sp.type='P'
-- what the hell I ve been doing lately on sql server 2005 / 2008
select o.name ,
(SELECT [definition] AS [text()] FROM sys.all_sql_modules where sys.all_sql_modules.object_id=a.object_id FOR XML PATH(''), TYPE) AS Statement_Text
, a.object_id, o.modify_date from sys.all_sql_modules a left join sys.objects o on a.object_id=o.object_id order by 4 desc
-- GET THE RIGHT LANG SCHEMA START
DECLARE @template AS varchar(max)
SET @template = 'SELECT * FROM {object_name}'
DECLARE @object_name AS sysname
SELECT @object_name = QUOTENAME(s.name) + '.' + QUOTENAME(o.name)
FROM sys.objects o
INNER JOIN sys.schemas s
ON s.schema_id = o.schema_id
WHERE o.object_id = OBJECT_ID(QUOTENAME(@LANG) + '.[TestingLanguagesInNameSpacesDelMe]')
IF @object_name IS NOT NULL
BEGIN
DECLARE @sql AS varchar(max)
SET @sql = REPLACE(@template, '{object_name}', @object_name)
EXEC (@sql)
END
-- GET THE RIGHT LANG SCHEMA END
-- SEE STORED PROCEDURE EXT PROPS end*/
set nocount off
GO
EXEC sys.sp_addextendedproperty @name=N'AuthorName', @value=N'Yordan Georgiev' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'PROCEDURE',@level1name=N'procUtils_SqlCheatSheet'
GO
EXEC sys.sp_addextendedproperty @name=N'ProcDescription', @value=N'TODO:ADD HERE DESCRIPTION' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'PROCEDURE',@level1name=N'procUtils_SqlCheatSheet'
GO
EXEC sys.sp_addextendedproperty @name=N'ProcVersion', @value=N'0.1.0.20090406.1317' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'PROCEDURE',@level1name=N'procUtils_SqlCheatSheet'
GO
/****** Object: UserDefinedFunction [ga].[GetDbVersion] Script Date: 04/22/2009 13:21:42 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE FUNCTION [ga].[GetDbVersion]()
RETURNS VARCHAR(20)
BEGIN
RETURN convert(varchar(20) , (select value from sys.extended_properties where name='DbVersion' and class_desc='DATABASE') )
END
GO
EXEC sys.sp_addextendedproperty @name=N'AuthorName', @value=N'Yordan Georgiev' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'FUNCTION',@level1name=N'GetDbVersion'
GO
EXEC sys.sp_addextendedproperty @name=N'ChangeDescription', @value=N'Initial creation' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'FUNCTION',@level1name=N'GetDbVersion'
GO
EXEC sys.sp_addextendedproperty @name=N'CreatedWhen', @value=N'getDate()' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'FUNCTION',@level1name=N'GetDbVersion'
GO
EXEC sys.sp_addextendedproperty @name=N'Description', @value=N'Gets the current version of the database ' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'FUNCTION',@level1name=N'GetDbVersion'
GO
/****** Object: UserDefinedFunction [ga].[GetDbType] Script Date: 04/22/2009 13:21:42 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE FUNCTION [ga].[GetDbType]()
RETURNS VARCHAR(30)
BEGIN
RETURN convert(varchar(30) , (select value from sys.extended_properties where name='DbType' and class_desc='DATABASE') )
END
GO
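Once both helper functions are deployed, a quick sanity check is to read back the version and type; a minimal sketch, assuming the database-level extended properties were set as in the script above:

```sql
-- Read back the DbVersion / DbType stored in the
-- database-level extended properties via the helper functions.
SELECT ga.GetDbVersion() AS DbVersion,  -- e.g. '0.0.1.20090414.1100'
       ga.GetDbType()    AS DbType;     -- e.g. 'DEV'
```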
/****** Object: Default [DF_tb_DataMeta_ObjChangeLog_DbVersion] Script Date: 04/22/2009 13:21:40 ******/
ALTER TABLE [ga].[tb_DataMeta_ObjChangeLog] ADD CONSTRAINT [DF_tb_DataMeta_ObjChangeLog_DbVersion] DEFAULT ([ga].[GetDbVersion]()) FOR [DbVersion]
GO
/****** Object: Default [DF_tb_DataMeta_ObjChangeLog_EventDate] Script Date: 04/22/2009 13:21:40 ******/
ALTER TABLE [ga].[tb_DataMeta_ObjChangeLog] ADD CONSTRAINT [DF_tb_DataMeta_ObjChangeLog_EventDate] DEFAULT (getdate()) FOR [EventDate]
GO
/****** Object: Default [DF_tb_DataMeta_ObjChangeLog_ObjVersion] Script Date: 04/22/2009 13:21:40 ******/
ALTER TABLE [ga].[tb_DataMeta_ObjChangeLog] ADD CONSTRAINT [DF_tb_DataMeta_ObjChangeLog_ObjVersion] DEFAULT ('0.0.0') FOR [ObjVersion]
GO
/****** Object: DdlTrigger [trigMetaDoc_TraceDbChanges] Script Date: 04/22/2009 13:21:29 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
create trigger [trigMetaDoc_TraceDbChanges]
on database
for create_procedure, alter_procedure, drop_procedure,
create_table, alter_table, drop_table,
create_function, alter_function, drop_function ,
create_trigger , alter_trigger , drop_trigger
as
set nocount on
declare @data xml
set @data = EVENTDATA()
declare @DbVersion varchar(20)
set @DbVersion = (select ga.GetDbVersion())
declare @DbType varchar(20)
set @DbType = (select ga.GetDbType())
declare @DbName varchar(256)
set @DbName = @data.value('(/EVENT_INSTANCE/DatabaseName)[1]', 'varchar(256)')
declare @EventType varchar(256)
set @EventType = @data.value('(/EVENT_INSTANCE/EventType)[1]', 'varchar(50)')
declare @ObjectName varchar(256)
set @ObjectName = @data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'varchar(256)')
declare @ObjectType varchar(25)
set @ObjectType = @data.value('(/EVENT_INSTANCE/ObjectType)[1]', 'varchar(25)')
declare @TSQLCommand varchar(max)
set @TSQLCommand = @data.value('(/EVENT_INSTANCE/TSQLCommand)[1]', 'varchar(max)')
-- decode the XML entities in the event data so the <Version> tag
-- embedded in the command text becomes parseable as XML
declare @opentag varchar(4)
set @opentag = '&lt;'
declare @closetag varchar(4)
set @closetag = '&gt;'
declare @newDataTxt varchar(max)
set @newDataTxt = cast(@data as varchar(max))
set @newDataTxt = REPLACE(REPLACE(@newDataTxt, @opentag, '<'), @closetag, '>')
-- print @newDataTxt
declare @newDataXml xml
set @newDataXml = CONVERT(xml, @newDataTxt)
declare @Version varchar(50)
set @Version = @newDataXml.value('(/EVENT_INSTANCE/TSQLCommand/CommandText/Version)[1]', 'varchar(50)')
-- if we are dropping, take the version from the existing object
if (SUBSTRING(@EventType, 1, 4)) = 'DROP'
set @Version = (select top 1 [Version] from ga.tb_DataMeta_ObjChangeLog where ObjectName = @ObjectName order by [LogId] desc)
declare @LoginName varchar(256)
set @LoginName = @data.value('(/EVENT_INSTANCE/LoginName)[1]', 'varchar(256)')
declare @FirstName varchar(50)
set @FirstName = (select [FirstName] from [ga].[tb_BLSec_LoginsForUsers] where [LoginName] = @LoginName)
declare @LastName varchar(50)
set @LastName = (select [LastName] from [ga].[tb_BLSec_LoginsForUsers] where [LoginName] = @LoginName)
declare @SchemaName sysname
set @SchemaName = @data.value('(/EVENT_INSTANCE/SchemaName)[1]', 'sysname');
--declare @Description xml
--set @Description = @data.query('(/EVENT_INSTANCE/TSQLCommand/text())')
--print 'VERSION IS ' + @Version
--print @newDataTxt
--print cast(@data as varchar(max))
-- select column_name from information_schema.columns where table_name ='tb_DataMeta_ObjChangeLog'
insert into [ga].[tb_DataMeta_ObjChangeLog]
(
[DatabaseName] ,
[SchemaName],
[DbVersion] ,
[DbType],
[EventType],
[ObjectName],
[ObjectType] ,
[Version],
[SqlCommand] ,
[LoginName] ,
[FirstName],
[LastName]
)
values(
@DbName,
@SchemaName,
@DbVersion,
@DbType,
@EventType,
@ObjectName,
@ObjectType ,
@Version,
@newDataTxt,
@LoginName ,
@FirstName ,
@LastName
)
GO
SET ANSI_NULLS OFF
GO
SET QUOTED_IDENTIFIER OFF
GO
DISABLE TRIGGER [trigMetaDoc_TraceDbChanges] ON DATABASE
GO
/****** Object: DdlTrigger [trigMetaDoc_TraceDbChanges] Script Date: 04/22/2009 13:21:29 ******/
ENABLE TRIGGER [trigMetaDoc_TraceDbChanges] ON DATABASE
GO
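To see the trigger in action, run a DDL statement tagged with the Version comment the trigger parses out of the command text, then query the log. This is a sketch using a hypothetical table name:

```sql
-- A tagged DDL statement; the trigger extracts 1.0.0 from the
-- <Version> tag embedded in the command text.
CREATE TABLE [ga].[tb_BL_Orders] (   -- hypothetical sample table
    [OrderId] int NULL
)
/*
<Version> 1.0.0 </Version>
*/
GO
-- The change and its version now appear in the tracking table:
SELECT TOP 1 [EventType], [ObjectName], [Version]
FROM [ga].[tb_DataMeta_ObjChangeLog]
ORDER BY [LogId] DESC;
```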

Resources