I created a server-level trigger to monitor when a database is created. The script was working fine on SQL Server 2014; now that we have moved to SQL Server 2017 it still works, but I receive a lot of emails.
CREATE TRIGGER [ddl_trig_database]
ON ALL SERVER
FOR ALTER_DATABASE
AS
DECLARE @results NVARCHAR(MAX)
DECLARE @subjectText NVARCHAR(MAX)
DECLARE @databaseName NVARCHAR(255)
SET @subjectText = 'NEW DATABASE Created on ' + @@SERVERNAME + ' by ' + SUSER_SNAME()
SET @results = (SELECT EVENTDATA().value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]','nvarchar(max)'))
SET @databaseName = (SELECT EVENTDATA().value('(/EVENT_INSTANCE/DatabaseName)[1]', 'VARCHAR(255)'))
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'EmailProfile',
    @recipients = 'test@domain.com',
    @body = @results,
    @subject = @subjectText,
    @exclude_query_output = 1 --Suppress 'Mail Queued' message
GO
For example, I receive each of these statements in a separate email:
ALTER DATABASE [testNewDB] SET DELAYED_DURABILITY = DISABLED
ALTER DATABASE [testNewDB] SET RECOVERY FULL
ALTER DATABASE [testNewDB] SET READ_WRITE
ALTER DATABASE [testNewDB] SET READ_COMMITTED_SNAPSHOT OFF
There are more, so I believe the trigger is sending the info for each configuration parameter of the newly created DB. Any idea how to receive only the info for the new DB created, without all the rest?
You can replace ALTER_DATABASE with CREATE_DATABASE, but this will not catch a restore, because a restore does not generate a DDL event.
CREATE TRIGGER [ddl_trig_database]
ON ALL SERVER
FOR CREATE_DATABASE
AS
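Keeping the original trigger body unchanged, the complete CREATE_DATABASE version would look like this sketch ('EmailProfile' and the recipient address are the placeholders from the question):

```sql
CREATE TRIGGER [ddl_trig_database]
ON ALL SERVER
FOR CREATE_DATABASE  -- fires once per CREATE DATABASE, not per ALTER
AS
DECLARE @results NVARCHAR(MAX)
DECLARE @subjectText NVARCHAR(MAX)
SET @subjectText = 'NEW DATABASE Created on ' + @@SERVERNAME + ' by ' + SUSER_SNAME()
SET @results = (SELECT EVENTDATA().value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]','nvarchar(max)'))
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'EmailProfile',
    @recipients = 'test@domain.com',
    @body = @results,
    @subject = @subjectText,
    @exclude_query_output = 1
GO
```

Because the trigger now fires only on the CREATE DATABASE event, the follow-up ALTER DATABASE statements issued for each database option no longer generate emails.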
The following article covers a solution that will work around the missing DDL event:
DDL triggers enable us to audit DDL changes but there are a few
missing events, design decisions and installation complications. This
post explains and provides a full solution that includes auditing for
database restores (there is no DDL event for this) and an incremental
self install, which keeps the whole server audit configured for DDL
auditing.
https://www.sqlservercentral.com/forums/topic/sql-2008-ddl-auditing-a-full-self-installingupdating-solution-for-whole-server
The solution in the article for RESTORE events involves a job that runs to check for new databases:
SQL 2008 Audit RESTORE DATABASE
SQL Agent job which runs (in less than 1 second) every 1 minute to
copy new restore database auditing information from
msdb.dbo.restorehistory to dbadata.dbo.ServerAudit. If it finds that a
database restore has happened but has not been audited it
automatically runs the “Setup DDL Audit” job because there is a
possibility that the restored database is not configured for DDL
auditing as expected.
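The heart of such a job is a query against msdb.dbo.restorehistory. A minimal sketch of the detection step (the one-minute window matches the job schedule above; the article's dbadata.dbo.ServerAudit target is not shown):

```sql
-- Detection step: restores completed since the last run of the job
SELECT rh.destination_database_name,
       rh.restore_date,
       rh.user_name,
       rh.restore_type   -- D = database, L = log, F = file, ...
FROM msdb.dbo.restorehistory AS rh
WHERE rh.restore_date > DATEADD(MINUTE, -1, GETDATE())
ORDER BY rh.restore_date DESC;
```

In the article's solution, any rows returned here are copied to the audit table, and a non-audited restore triggers a re-run of the "Setup DDL Audit" job.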
I have a database server where some databases with restricted users are in use. I need to prevent those users from changing the .MDF and .LDF autogrowth settings. Please guide me on how to restrict the users.
I think there are two ways to get this done:
Disable autogrowth in the databases
Limit the maximum size of the MDF and LDF files
But I couldn't find any option in Management Studio to do either of them server-wide, or to control users' access to them.
Thanks.
You can execute the following ALTER DATABASE command, which sets the auto-growth option to off; to cover all databases, use the undocumented stored procedure sp_MSforeachdb.
For a single database (Parallel Data Warehouse instances only):
ALTER DATABASE [database_name] SET AUTOGROW = OFF
For all databases:
EXEC sp_Msforeachdb "ALTER DATABASE [?] SET AUTOGROW = OFF"
Although this is not a server variable or an instance setting, it might help ease the task of updating all databases on the SQL Server instance.
Excluding the system databases, the following T-SQL can be executed to list all database files; the commands it outputs can then be executed:
select
'ALTER DATABASE [' + db_name(database_id) + '] MODIFY FILE ( NAME = N''' + name + ''', FILEGROWTH = 0)'
from sys.master_files
where database_id > 4
To prevent the data files' autogrow property from being changed, I prepared the SQL Server DDL trigger below; I had previously used a DDL trigger for logging DROP TABLE statements.
The trigger will also prevent you from changing this property, so if you ever need to update it, you have to drop the trigger first.
CREATE TRIGGER prevent_filegrowth
ON ALL SERVER
FOR ALTER_DATABASE
AS
declare @SqlCommand nvarchar(max)
set @SqlCommand = ( SELECT EVENTDATA().value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]','nvarchar(max)') );
if( isnull(charindex('FILEGROWTH', @SqlCommand), 0) > 0 )
begin
    RAISERROR ('FILEGROWTH property cannot be altered', 16, 1)
    ROLLBACK
end
GO
For more on DDL Triggers, please refer to Microsoft Docs
I have a SQL Server 2005 DB project and am looking to deploy the Schema over an existing DB that is on a later version of SQL Server. The issue I have is that Change Tracking is enabled on the DB I wish to deploy to and so the first thing SSDT wants to do is disable CT. This poses a problem as I get the error below:
(43,1): SQL72014: .Net SqlClient Data Provider: Msg 22115, Level 16,
State 1, Line 5 Change tracking is enabled for one or more tables in
database 'Test'. Disable change tracking on each table before
disabling it for the database. Use the sys.change_tracking_tables
catalog view to obtain a list of tables for which change tracking is
enabled. (39,0): SQL72045: Script execution error. The executed
script:
IF EXISTS (SELECT 1
FROM [master].[dbo].[sysdatabases]
WHERE [name] = N'$(DatabaseName)')
BEGIN
ALTER DATABASE [$(DatabaseName)]
SET CHANGE_TRACKING = OFF
WITH ROLLBACK IMMEDIATE;
END
In an effort to get around this I have created a PreDeployment script that executes the below:
/* Run pre-deployment scripts to resolve issues */
IF(SELECT SUBSTRING(@@VERSION, 29,4)) = '11.0'
BEGIN
PRINT 'Enabling Change Tracking';
DECLARE @dbname VARCHAR(250)
SELECT @dbname = DB_NAME()
EXEC('
IF NOT EXISTS(SELECT * FROM sys.change_tracking_databases WHERE database_id = DB_ID(''' + @dbname + '''))
ALTER DATABASE ['+ @dbname +
'] SET CHANGE_TRACKING = ON
(CHANGE_RETENTION = 5 DAYS, AUTO_CLEANUP = ON);
');
EXEC('
IF NOT EXISTS(SELECT * FROM sys.change_tracking_tables ctt
INNER JOIN sys.tables t ON t.object_id = ctt.object_id
INNER JOIN sys.schemas s ON s.schema_id = t.schema_id
WHERE t.name = ''TableName'')
BEGIN
ALTER TABLE [dbo].[TableName] ENABLE CHANGE_TRACKING;
END;');
END
So, based on the DB version, Change Tracking is enabled on the DB and the relevant tables, assuming it is not already enabled. I got this idea from a previous post: # ifdef type conditional compilation in T-SQL sql server 2008 2005
Unfortunately this is still not working as SSDT is trying to disable Change Tracking before the PreDeployment script is executed.
Make sure change tracking is enabled in your database project.
Open your database project's properties > Project Settings > Database Settings... > Operational tab > check the "Change tracking" option
As Keith said, if you want it enabled, enable it there. If you do want to disable it, then just run your script before doing the compare, so you have a "pre-pre-deploy" script:
https://the.agilesql.club/Blog/Ed-Elliott/Pre-Compare-and-Pre-Deploy-Scripts-In-SSDT
If you are disabling it then it is a one off thing so pretty simple.
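For the disabling case, such a pre-compare script could first turn off change tracking per table, then for the database itself. A minimal sketch, assuming SQL Server 2012+ for ALTER DATABASE CURRENT:

```sql
-- Pre-compare script: disable change tracking per table, then for the database.
DECLARE @sql NVARCHAR(MAX) = N'';

SELECT @sql = @sql
    + N'ALTER TABLE ' + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name)
    + N' DISABLE CHANGE_TRACKING;' + CHAR(10)
FROM sys.change_tracking_tables AS ctt
INNER JOIN sys.tables  AS t ON t.object_id = ctt.object_id
INNER JOIN sys.schemas AS s ON s.schema_id = t.schema_id;

EXEC sp_executesql @sql;

IF EXISTS (SELECT * FROM sys.change_tracking_databases WHERE database_id = DB_ID())
    ALTER DATABASE CURRENT SET CHANGE_TRACKING = OFF;
```

Run this outside SSDT, before the compare, so the generated deployment script no longer needs to disable change tracking itself.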
Other options are to write your own deployment contributor or to raise a bug via Connect.
Deployment Contributor:
https://the.agilesql.club/blog/Ed-Elliott/2015/09/23/Inside-A-SSDT-Deployment-Contributor
https://github.com/DacFxDeploymentContributors/Contributors
Ed
The following is the start of the standard Install Northwind SQL script, as provided by Microsoft in their SQL Server 2000 sample databases installer.
SET NOCOUNT ON
GO
USE master
GO
if exists (select * from sysdatabases where name='Northwind')
drop database Northwind
go
DECLARE @device_directory NVARCHAR(520)
SELECT @device_directory = SUBSTRING(filename, 1, CHARINDEX(N'master.mdf', LOWER(filename)) - 1)
FROM master.dbo.sysaltfiles WHERE dbid = 1 AND fileid = 1
EXECUTE (N'CREATE DATABASE Northwind
  ON PRIMARY (NAME = N''Northwind'', FILENAME = N''' + @device_directory + N'northwnd.mdf'')
  LOG ON (NAME = N''Northwind_log'', FILENAME = N''' + @device_directory + N'northwnd.ldf'')')
go
exec sp_dboption 'Northwind','trunc. log on chkpt.','true'
exec sp_dboption 'Northwind','select into/bulkcopy','true'
GO
set quoted_identifier on
GO
Under normal circumstances I always use a full copy of SQL Server or SQL Server Express for development. However, an unrelated support issue with a third-party component has occurred that requires me to provide a self-contained sample application, with the basic Northwind database file included in the sample, using LocalDB.
To that end, how should I adapt the EXECUTE (CREATE DATABASE ...) section of the SQL script so that it creates a copy of the Northwind .mdf in a given location (let's say C:\MyData)? I could then send that file with the sample I need to build for the support team. Essentially, it is vital that they have a completely self-contained sample to help narrow down the problem.
Many Thanks
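One way to adapt it is to replace the lookup of the master data directory with a fixed path. A sketch (the C:\MyData folder from the question must already exist before the script runs):

```sql
-- Sketch: point the Northwind files at a fixed folder instead of the
-- instance's master data directory.
DECLARE @device_directory NVARCHAR(520)
SET @device_directory = N'C:\MyData\'

EXECUTE (N'CREATE DATABASE Northwind
  ON PRIMARY (NAME = N''Northwind'', FILENAME = N''' + @device_directory + N'northwnd.mdf'')
  LOG ON (NAME = N''Northwind_log'', FILENAME = N''' + @device_directory + N'northwnd.ldf'')')
```

After the rest of the install script has run, detaching the database (for example, EXEC sp_detach_db N'Northwind') releases the .mdf and .ldf files so they can be copied into the self-contained sample.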
I have a SQL database that I am currently converting from an Access database. One of the features of the Access database is to 'Copy DB' and is used when working in a 'dev' site - it copies all of the production data by physically copying the production file into the dev site. In this way, all production data and structure and queries and everything is duplicated into the dev version. This only needs to be done occasionally when we are testing why something is occurring and we don't want to play in production data; not rare, but not frequent either - perhaps once or twice a month. I'm wondering what have other people done to accomplish this when working in SQL?
I'm thinking that I could do a DB backup followed by a restore to the DEV version, but I don't want this backup to interfere with normal backup processes. Is there any way to do this from one DB straight to another instead of going to the file system, and have the backup seem like it never happened (i.e. the REAL backup will still back up all items that really needed to be backed up)?
What other options are there? I have SQL Compare and SQL Data Compare from Red Gate, but I need to expose this functionality to certain users (with high privs and access to the DEV site), and they don't have it.
OK, after looking around a bit, I've come to the conclusion that I do have to go through the file system, but there is a way to do a backup/restore without affecting the normal backup process: copy-only mode. Here's the script to do it:
BACKUP DATABASE [ProductionDB]
TO DISK = N'D:\ProductionDBToDevTransfer.bak'
WITH
COPY_ONLY,
NOFORMAT,
INIT,
NAME = N'DB-Full Backup',
SKIP,
NOREWIND,
NOUNLOAD,
STATS = 10
RESTORE DATABASE [DevDB]
FROM DISK = N'D:\ProductionDBToDevTransfer.bak'
WITH
FILE = 1,
MOVE N'ProductionDB' TO N'D:\Microsoft\SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA\DevDB.mdf',
MOVE N'ProductionDB_log' TO N'D:\Microsoft\SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA\DevDB_log.ldf',
NOUNLOAD,
REPLACE,
STATS = 10
Take particular note of the MOVE clauses in the RESTORE command: by default, RESTORE restores to the originally backed-up physical files, not the dev DB files, despite the fact that you are restoring to the dev DB. I almost found out the hard way: when I did the restore, SSMS complained that the files were in use by another DB. How non-intuitive.
The script above works, but doesn't change the logical file names for the copied database. So, if you try to run it again to reverse the process, it will fail in the MOVE clauses.
I modified the script a little and came up with the following, which seems to work for me. I'm new to this, so be careful!
DECLARE @SOURCEDB nvarchar(100)
DECLARE @SOURCEDBLOG nvarchar(100)
DECLARE @DESTINATIONDB nvarchar(100)
DECLARE @DESTINATIONDBLOG nvarchar(100)
DECLARE @BACKUPDIR nvarchar(100)
DECLARE @BACKUPFILE nvarchar(100)
DECLARE @BACKUPNAME nvarchar(100)
DECLARE @SQLDATADIR nvarchar(100)
DECLARE @SQLDATABACKUPFILE nvarchar(100)
DECLARE @SQLDATABACKUPLOGFILE nvarchar(100)
--CHANGE THESE VALUES TO MATCH YOUR SYSTEM
SET @SOURCEDB = N'test'
SET @DESTINATIONDB = N'test-backup'
SET @BACKUPDIR = N'C:\SHARED\'
SET @SQLDATADIR = N'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\'
--CALCULATED VALUES
SET @SOURCEDBLOG = @SOURCEDB + N'_log'
SET @DESTINATIONDBLOG = @DESTINATIONDB + N'_log'
SET @BACKUPFILE = @BACKUPDIR + @SOURCEDB + N'-to-' + @DESTINATIONDB + N'.bak'
SET @BACKUPNAME = @SOURCEDB + N'-Full Backup'
SET @SQLDATABACKUPFILE = @SQLDATADIR + @DESTINATIONDB + N'.mdf'
SET @SQLDATABACKUPLOGFILE = @SQLDATADIR + @DESTINATIONDBLOG + N'.ldf'
--BACKUP THE DATABASE
BACKUP DATABASE @SOURCEDB
TO DISK = @BACKUPFILE
WITH
COPY_ONLY,
NOFORMAT,
INIT,
NAME = @BACKUPNAME,
SKIP,
NOREWIND,
NOUNLOAD,
STATS = 10
--RESTORE THE BACKUP TO THE NEW DATABASE NAME
RESTORE DATABASE @DESTINATIONDB
FROM DISK = @BACKUPFILE
WITH
FILE = 1,
MOVE @SOURCEDB TO @SQLDATABACKUPFILE,
MOVE @SOURCEDBLOG TO @SQLDATABACKUPLOGFILE,
NOUNLOAD,
REPLACE,
STATS = 10
--UPDATE THE LOGICAL FILE NAMES
DECLARE @TEMPLATE varchar(500)
DECLARE @SCRIPT varchar(500)
SET @TEMPLATE = N'ALTER DATABASE [{DBNAME}] MODIFY FILE (NAME = [{OLD}], NEWNAME = [{NEW}])'
SET @SCRIPT = REPLACE(REPLACE(REPLACE(@TEMPLATE, '{DBNAME}', @DESTINATIONDB),'{OLD}',@SOURCEDB),'{NEW}',@DESTINATIONDB)
EXEC(@SCRIPT)
SET @SCRIPT = REPLACE(REPLACE(REPLACE(@TEMPLATE, '{DBNAME}', @DESTINATIONDB),'{OLD}',@SOURCEDBLOG),'{NEW}',@DESTINATIONDBLOG)
EXEC(@SCRIPT)
You can restore a database directly from another database.
If you're using SQL Management Studio, select "From database" instead of "From device" in the Restore Database dialog box.
You generally want to restore a whole database from a backup. Trying to do it directly from a live running prod database could cause locking issues for your users. You can do this using SSIS, but it is neither a simple nor quick thing to set up properly.
Another possibility is to turn off prod for a time, if you can (only if you have a period when users are not in the database). Then detach the database, detach the dev database and delete it, copy the file to the dev server, and attach both databases again. This can be faster than a restore, but it is a rare environment now that doesn't have 24-hour data access on production.
Incidentally it is very much preferable to have dev and prod on separate servers.
And if you are restoring to dev, you need to make sure any dev changes that have not yet been committed to prod are scripted, so they can be run immediately after the restore. It's best if you script any and all database changes and store them in source control. That makes it easier to do this.
We do an on-demand backup of the production data, and then restore the backup on the dev machine.