Avoid hardcoding when changing values in SQL Server trigger

I have a SQL Server "instead of insert" trigger that populates a single column (PromoCode). It all works perfectly, but I don't like the fact that I have had to hardcode the columns in the actual INSERT statement:
CREATE TRIGGER PopulateOrderPromoCode ON Order
INSTEAD OF INSERT
AS BEGIN
    --// Get the Promo Code
    DECLARE @PromoCode int;
    EXEC GetPromoCode @PromoCode OUTPUT;

    --// Insert the order with the new Promo Code
    INSERT INTO Order (Id, CustomerId, PromoCode)
    SELECT Id, CustomerId, @PromoCode FROM inserted;
END
I would prefer to simply replace the value inside inserted.PromoCode with @PromoCode and then use:
INSERT INTO Order
SELECT * FROM inserted;
Can this be done?

Don't use an INSTEAD OF INSERT trigger (in which you have to take over the insert logic)
Use a normal INSERT trigger (which allows you to do stuff in addition to the insert)
This assumes you can insert without a promo code (allows nulls) or the promo code defaults to something.
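If PromoCode is currently NOT NULL with no default, a minimal sketch of relaxing it first (assuming the column is int, as in the trigger below):
ALTER TABLE [Order] ALTER COLUMN PromoCode int NULL;  -- assumption: PromoCode is int and may stay NULL until the trigger fills it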
CREATE TRIGGER PopulateOrderPromoCode ON Order
FOR INSERT
AS
BEGIN
    --// Get the Promo Code
    DECLARE @PromoCode int;
    EXEC GetPromoCode @PromoCode OUTPUT;

    --// Update the order with the new Promo Code
    UPDATE Order SET PromoCode = @PromoCode
    WHERE ID IN (SELECT ID FROM inserted)
END

INSERTED is a read-only pseudo-table that can be accessed in any trigger; you cannot modify it.
The way you are performing the INSERT is the right approach, so there's nothing wrong with it. In my opinion, it is good practice to specify the columns explicitly in an INSERT anyway.

Dynamic SQL would be your only other option. Try this:
CREATE TRIGGER PopulateOrderPromoCode
ON Order
INSTEAD OF INSERT
AS
BEGIN
    --// Get the Promo Code
    DECLARE @PromoCode int;
    EXEC GetPromoCode @PromoCode OUTPUT;

    DECLARE @InsertSQL nvarchar(2000), @SelectSQL nvarchar(2000)
    SET @InsertSQL = 'INSERT INTO Order ('
    SET @SelectSQL = 'SELECT '

    --// Walk the columns alphabetically, skipping PromoCode
    DECLARE @CurrentCol sysname
    SET @CurrentCol = ''
    WHILE EXISTS ( SELECT TOP 1 QUOTENAME(name)
                   FROM sys.syscolumns
                   WHERE object_name(id) = 'Order'
                     AND name <> 'PromoCode'
                     AND QUOTENAME(name) > @CurrentCol)
    BEGIN
        SET @CurrentCol = ( SELECT TOP 1 QUOTENAME(name)
                            FROM sys.syscolumns
                            WHERE object_name(id) = 'Order'
                              AND name <> 'PromoCode'
                              AND QUOTENAME(name) > @CurrentCol
                            ORDER BY name)
        IF @CurrentCol IS NULL BREAK;
        SET @InsertSQL = @InsertSQL + @CurrentCol + ', '
        SET @SelectSQL = @SelectSQL + @CurrentCol + ', '
    END

    --// Finish and concatenate the strings
    SET @InsertSQL = @InsertSQL + 'PromoCode) '
    SET @SelectSQL = @SelectSQL + '''' + CAST(@PromoCode AS nvarchar(12)) + '''' + ' FROM INSERTED'

    DECLARE @MasterSQL nvarchar(2000)
    SET @MasterSQL = @InsertSQL + @SelectSQL
    EXEC (@MasterSQL)
END
BTW - "order" is a poor choice for a table name - it's also a reserved word in SQL. Try Orders or OrderHeader.
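If renaming isn't an option, every reference to the table has to be bracket-quoted, for example:
SELECT Id, CustomerId, PromoCode FROM [dbo].[Order];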

Related

Dynamic Database Stored Procedure on SQL Server 2016

I'm trying to build a stored procedure that will query multiple databases depending on the databases required.
For example:
SP_Users takes a list of @DATABASES as a parameter.
For each database it needs to run the same query and union the results together.
I believe a CTE could be my best bet, so I have something like this at the moment.
SET @DATABASES = 'DB_1, DB_2' -- Two databases listed in a string
-- I have a split string function that will extract each database
SET @CURRENT_DB = 'DB_1'

WITH UsersCTE (Name, Email)
AS (SELECT Name, Email
    FROM [@CURRENT_DB].[dbo].Users)
SELECT @DATABASE as DB, Name, Email
FROM UsersCTE
What I don't want to do is hard-code the databases in the query. The steps I imagine are:
Split the parameter @DATABASES to extract and set the @CURRENT_DB variable
Iterate through the query with a recursive CTE until all the @DATABASES have been processed
Union all results together and return the data.
Not sure if this is the right approach to tackling this problem.
Using @databases:
As mentioned in the comments to your question, variables can't be used to dynamically select a database, so dynamic SQL is indicated. You can start by building your template SQL statement:
declare @sql nvarchar(max) =
    'union all ' +
    'select ''@db'' as db, name, email ' +
    'from [@db].dbo.users ';
Since you have SQL Server 2016, you can split using the string_split function, with your @databases variable as input. This will result in a table with 'value' as the column name, which holds the database names.
Use the replace function to replace @db in the template with value. This will result in one SQL statement for each database you passed into @databases. Then, concatenate the statements back together. Unfortunately, in version 2016, there's no built-in function to do that, so we have to use the famous FOR XML trick to join the statements, then we use .value to convert it to a string, and finally we use STUFF to get rid of the leading 'union all'.
Take the result of the concatenated output and overwrite the @sql variable. It is ready to go at this point, so execute it.
All of that is done in this code:
declare @databases nvarchar(max) = 'db_1,db_2';

set @sql = stuff(
    (
        select replace(@sql, '@db', value)
        from string_split(@databases, ',')
        for xml path(''), type
    ).value('.[1]', 'nvarchar(max)')
    , 1, 9, '');

exec(@sql);
Untested, of course, but if you print instead of execute, it seems to give the proper sql statement for your needs.
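For illustration (not part of the original answer), with @databases = 'db_1,db_2' the printed statement should come out roughly like this (line break added for readability):
select 'db_1' as db, name, email from [db_1].dbo.users
union all select 'db_2' as db, name, email from [db_2].dbo.users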
Using msForEachDB:
Now, if you didn't want to have to know which databases have a 'users' table (such as when you're in an environment with a different database for every client), you can use sp_msForEachDb and check the structure first to make sure the database has a 'users' table with 'name' and 'email' columns. If so, execute the appropriate statement; if not, execute a dummy statement. I won't describe this one, I'll just give the code:
declare @aggregator table (
    db sysname,
    name nvarchar(255),
    email nvarchar(255)
);
insert @aggregator
exec sp_msforeachdb '
    declare @sql nvarchar(max) = ''select db = '''''''', name = '''''''', email = '''''''' where 1 = 2'';

    select @sql = ''select db = ''''?'''', name, email from ['' + table_catalog + ''].dbo.users''
    from [?].information_schema.columns
    where table_schema = ''dbo''
    and table_name = ''users''
    and column_name in (''name'', ''email'')
    group by table_catalog
    having count(*) = 2

    exec (@sql);
';

select *
from @aggregator
I took the valid advice from others here and went with this, which works great for what I need.
I decided to use a loop to build the query up. Hope this helps someone else looking to do something similar.
CREATE PROCEDURE [dbo].[SP_Users](
    @DATABASES VARCHAR(MAX) = NULL,
    @PARAM1 VARCHAR(250),
    @PARAM2 VARCHAR(250)
)
AS
BEGIN
    SET NOCOUNT ON;

    --Local variables
    DECLARE
        @COUNTER INT = 0,
        @SQL NVARCHAR(MAX) = '',
        @CURRENTDB VARCHAR(50) = NULL,
        @MAX INT = 0,
        @ERRORMSG VARCHAR(MAX)

    --Check we have databases entered
    IF @DATABASES IS NULL
    BEGIN
        RAISERROR('ERROR: No Databases Provided.
            Please provide a list of databases to execute the procedure. See stored procedure:
            [SP_Users]', 16, 1)
        RETURN
    END

    -- SET number of iterations based on number of returned databases
    SET @MAX = (SELECT COUNT(*) FROM
        (SELECT ROW_NUMBER() OVER (ORDER BY i.value) AS RowNumber, i.value
         FROM dbo.udf_SplitVariable(@DATABASES, ',') AS i) X)

    -- Build SQL statement
    WHILE @COUNTER < @MAX
    BEGIN
        --Set the current database
        SET @CURRENTDB = (SELECT X.Value FROM
            (SELECT ROW_NUMBER() OVER (ORDER BY i.value) AS RowNumber, i.value
             FROM dbo.udf_SplitVariable(@DATABASES, ',') AS i
             ORDER BY RowNumber OFFSET @COUNTER
             ROWS FETCH NEXT 1 ROWS ONLY) X);

        SET @SQL = @SQL + N'
            (
            SELECT Name, Email
            FROM [' + @CURRENTDB + '].[dbo].Users
            WHERE
                (Name = @PARAM1 OR @PARAM1 IS NULL)
                AND (Email = @PARAM2 OR @PARAM2 IS NULL)
            ) '
            + N' UNION ALL '

        PRINT @CURRENTDB
        PRINT @SQL

        SET @COUNTER = @COUNTER + 1
    END

    -- Remove the last N' UNION ALL '
    IF LEN(@SQL) > 11
        SET @SQL = LEFT(@SQL, LEN(@SQL) - 11)

    EXEC sp_executesql @SQL, N'@CURRENTDB VARCHAR(50),
        @PARAM1 VARCHAR(250),
        @PARAM2 VARCHAR(250)',
        @CURRENTDB,
        @PARAM1,
        @PARAM2
END
Split Variable Function
CREATE FUNCTION [dbo].[udf_SplitVariable]
(
    @List varchar(8000),
    @SplitOn varchar(5) = ','
)
RETURNS @RtnValue TABLE
(
    Id INT IDENTITY(1,1),
    Value VARCHAR(8000)
)
AS
BEGIN
    --Account for ticks
    SET @List = (REPLACE(@List, '''', ''))

    --Account for 'emptynull'
    IF LTRIM(RTRIM(@List)) = 'emptynull'
    BEGIN
        SET @List = ''
    END

    --Loop through all of the items in the string and add records for each item
    WHILE (CHARINDEX(@SplitOn, @List) > 0)
    BEGIN
        INSERT INTO @RtnValue (Value)
        SELECT Value = LTRIM(RTRIM(SUBSTRING(@List, 1, CHARINDEX(@SplitOn, @List) - 1)))

        SET @List = SUBSTRING(@List, CHARINDEX(@SplitOn, @List) + LEN(@SplitOn), LEN(@List))
    END

    INSERT INTO @RtnValue (Value)
    SELECT Value = LTRIM(RTRIM(@List))

    RETURN
END

Logging data changes into table with dynamically changing name in MS SQL

I am trying to log data changes in MS SQL with a trigger, and I want to create a new history table every month. After I found an answer on how to change the table name dynamically, I can't access the DELETED and INSERTED tables anymore; it says invalid object name.
ALTER TRIGGER [dbo].[teszttablatrigger] ON [teszt].[dbo].[teszt] FOR DELETE, INSERT, UPDATE AS

declare @hist nvarchar(40)
set @hist = 'teszthistory_' + CAST(YEAR(getdate()) as NCHAR(4)) + '_' + (case when Month(GETDATE()) < 10 then '0' + CAST(Month(GETDATE()) as NCHAR(1))
                                                                              when Month(GETDATE()) >= 10 then CAST(Month(GETDATE()) as NCHAR(2)) end)

declare @DynamicSql1 nvarchar(2000)
declare @DynamicSql2 nvarchar(2000)

set @DynamicSql1 = N'IF NOT EXISTS (SELECT * FROM sysobjects WHERE id = object_id(N''[History].[dbo].[@hist]'')
    AND OBJECTPROPERTY(id, N''IsUserTable'') = 1)
    CREATE TABLE [History].[dbo].[@hist] ( kulcs int, szoveg varchar(40), modtip varchar(40), datum datetime default getdate())'
Exec sp_executesql @DynamicSql1, N'@hist nvarchar(40)', @hist = @hist

set @DynamicSql2 = N'INSERT INTO [History].[dbo].[@hist] (kulcs, szoveg, modtip)
    SELECT kulcs, szoveg, ''delete''
    FROM DELETED
    INSERT INTO [History].[dbo].[@hist] (kulcs, szoveg, modtip)
    SELECT kulcs, szoveg, ''insert''
    FROM INSERTED'
Exec sp_executesql @DynamicSql2, N'@hist nvarchar(40)', @hist = @hist
Thanks for the answers in advance.
Dynamic SQL is executed in its own scope, so you can't access the inserted/deleted objects from it.
You could write a SQLCLR trigger in C# (see this example: SQLCLR Trigger),
but I think the easiest way is to write the changes to a temp table first, so the dynamic part is fixed.
Take a look:
DROP TRIGGER [test_history]
GO
CREATE TRIGGER [test_history] ON [test_table]
FOR DELETE, INSERT, UPDATE
AS
BEGIN
    declare @date datetime = getdate()
    declare @guid uniqueidentifier = newid()
    declare @hist nvarchar(40) = 'test_history_' + CAST(YEAR(@date) as VARCHAR(4)) + '_' + right('0' + CAST(Month(@date) as VARCHAR(2)), 2)

    DECLARE @T1 BIT = 0
    SELECT top 1 @T1 = 1 FROM sys.tables WHERE [TYPE] = 'U' AND name = 'test_history_9999_99'
    IF @T1 = 1 TRUNCATE table test_history_9999_99

    DECLARE @T2 BIT = 0
    SELECT top 1 @T2 = 1 FROM sys.tables WHERE [TYPE] = 'U' AND name = @hist

    IF @T1 = 0 BEGIN
        SELECT ID, [desc], @date DATE_TIME, cast('delete' as varchar(20)) as operation, CAST(@guid AS varchar(64)) BATCH
        INTO test_history_9999_99
        FROM DELETED
    END ELSE BEGIN
        INSERT INTO test_history_9999_99
        SELECT ID, [desc], @date, cast('delete' as varchar(20)) as operation, CAST(@guid AS varchar(64)) BATCH
        FROM DELETED
    END

    INSERT INTO test_history_9999_99
    SELECT ID, [desc], @date, cast('insert' as varchar(20)) as operation, CAST(@guid AS varchar(64)) BATCH
    FROM inserted

    IF @T2 = 0 BEGIN
        EXEC sp_rename 'test_history_9999_99', @hist
    END ELSE BEGIN
        declare @DynamicSql nvarchar(2000)
        SET @DynamicSql = 'INSERT INTO ' + @hist + ' SELECT * FROM test_history_9999_99;'
        Exec sp_executesql @DynamicSql
    END
END
My test_table contains only two columns, ID and [Desc].
In the history tables I have added a DATETIME column with the change date and a UNIQUEIDENTIFIER column, so you can group all the changes of a batch together when you INSERT/UPDATE many records with a single operation.
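As an illustration (the month-table name below is hypothetical), grouping on BATCH shows which history rows were produced by the same statement:
SELECT BATCH, MIN(DATE_TIME) AS change_time, COUNT(*) AS rows_affected
FROM test_history_2017_01   -- hypothetical month table
GROUP BY BATCH
ORDER BY change_time;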
Thanks for the answer @MtwStark. Now it works: I can check whether the table exists, create it if not, and I have access to the DELETED and INSERTED tables.
I'm not sure whether, in your solution, I have to create the test_history_9999_99 table in advance, because when I used your trigger I got an error about column insertion (I didn't understand the error completely).
Now my code looks like this. I'm not sure if it can handle an INSERT/UPDATE of many records with a single operation. Probably I still need to add this code for it: CAST(@guid AS varchar(64)) BATCH. I'm not sure what it really does, I have to look into it more deeply.
CREATE TRIGGER [dbo].[teszttablatrigger] ON [teszt].[dbo].[teszt] FOR DELETE, INSERT, UPDATE AS

declare @hist nvarchar(40)
set @hist = 'teszthistory_' + CAST(YEAR(getdate()) as NCHAR(4)) + '_' + (case when Month(GETDATE()) < 10 then '0' + CAST(Month(GETDATE()) as NCHAR(1))
                                                                              when Month(GETDATE()) >= 10 then CAST(Month(GETDATE()) as NCHAR(2)) end)

select * into #ins from inserted
select * into #del from deleted

declare @DynamicSql nvarchar(2000)

DECLARE @T2 BIT = 0
SELECT top 1 @T2 = 1 FROM sys.tables WHERE [TYPE] = 'U' AND name = @hist

if @T2 = 0 begin
    set @DynamicSql = N'CREATE TABLE [' + @hist + '] ( kulcs int, szoveg varchar(40), modtip varchar(40), datum datetime default getdate())'
    Exec sp_executesql @DynamicSql
end

set @DynamicSql = N'INSERT INTO ' + @hist + ' (kulcs, szoveg, modtip)
    SELECT kulcs, szoveg, ''delete''
    FROM #del
    INSERT INTO ' + @hist + ' (kulcs, szoveg, modtip)
    SELECT kulcs, szoveg, ''insert''
    FROM #ins'
Exec sp_executesql @DynamicSql
Try refreshing IntelliSense (Ctrl+Shift+R) to see if that might help, or do a database table refresh.
If you have SQL Server Enterprise (check your version), then a better way would be to enable CDC (Change Data Capture).
https://msdn.microsoft.com/en-us/library/cc645937(v=sql.110).aspx
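A minimal sketch of enabling CDC for the dbo.teszt table from the question (default options assumed; requires appropriate permissions):
-- Enable CDC at the database level
EXEC sys.sp_cdc_enable_db;
-- Enable CDC for the table; a capture instance named dbo_teszt is created
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'teszt',
    @role_name     = NULL;
-- Changes can then be read back, for example:
DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn('dbo_teszt');
DECLARE @to_lsn   binary(10) = sys.fn_cdc_get_max_lsn();
SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_teszt(@from_lsn, @to_lsn, N'all');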

How to get a list of all changed tables from SQL Server Change Tracking

How do I get a list of all tables (which already have Change Tracking enabled) that have any tracked changes after a given version?
This will return a list of all the tables that have changed since the previous tracking version:
set nocount on;

-- We want to check for changes since the previous version
--declare @prevTrackingVersion int = INSERT_YOUR_PREV_VERSION_HERE

-- Comment out this line if you know the previous version
declare @prevTrackingVersion int = CHANGE_TRACKING_CURRENT_VERSION() - 1

-- Get a list of tables with change tracking enabled
declare @trackedTables as table (name nvarchar(1000));
insert into @trackedTables (name)
    select sys.tables.name from sys.change_tracking_tables
    join sys.tables ON tables.object_id = change_tracking_tables.object_id

-- This will be the list of tables with changes
declare @changedTables as table (name nvarchar(1000));

-- For each table name in tracked tables
declare @tableName nvarchar(1000)
while exists(select top 1 * from @trackedTables)
begin
    -- Set the current table name
    set @tableName = (select top 1 name from @trackedTables order by name asc);

    -- Determine if the table has changed since the previous version
    declare @sql nvarchar(250)
    declare @retVal int
    set @sql = 'select @retVal = count(*) from changetable(changes ' + @tableName + ', ' + cast(@prevTrackingVersion as varchar) + ') as changedTable'
    exec sp_executesql @sql, N'@retVal int output', @retVal output

    if @retVal > 0
    begin
        insert into @changedTables (name) select @tableName
    end

    -- Delete the current table name
    delete from @trackedTables where name = @tableName;
end

select * from @changedTables;
Well to get a list of all tables that have change tracking enabled you would perform a query like
SELECT sys.tables.name FROM sys.change_tracking_tables
JOIN sys.tables ON tables.object_id = change_tracking_tables.object_id
Then you can add a where condition for the version if you'd like to. I believe that answers your question.
Also, if you'd like to see some info on the changes, you can run a query like the one below for a specific table using the CHANGETABLE function.
DECLARE @synchronization_version NVARCHAR(MAX), @last_synchronization_version NVARCHAR(MAX)
SET @synchronization_version = CHANGE_TRACKING_CURRENT_VERSION();
SELECT
    CT.*
FROM
    CHANGETABLE(CHANGES Sales.CreditCard, @last_synchronization_version) AS CT
UPDATE
I updated the original query to perform a loop and print the results, so you'll be able to review the tables before you execute the query. Since you have over 1000 tables per your comment, you might want to remove some.
SET NOCOUNT ON;
DECLARE @Views as TABLE (name nvarchar(200));

INSERT INTO @Views (name)
SELECT sys.tables.name FROM sys.change_tracking_tables
JOIN sys.tables ON tables.object_id = change_tracking_tables.object_id

DECLARE @viewName nvarchar(200) = (select top 1 name from @Views);
DECLARE @sql nvarchar(max) = '';
DECLARE @union NVARCHAR(20)
DECLARE @sql1 NVARCHAR(max)

SET @sql1 = 'DECLARE @synchronization_version NVARCHAR(MAX), @last_synchronization_version NVARCHAR(MAX)
SET @synchronization_version = CHANGE_TRACKING_CURRENT_VERSION();'
PRINT(@sql1)

WHILE (EXISTS(select 1 from @Views)) BEGIN
    SET @union = '';
    SET @sql = '
    SELECT
        CT.*
    FROM
        CHANGETABLE(CHANGES ' + @viewName + ', @last_synchronization_version) AS CT'

    IF (SELECT COUNT(name) FROM @Views) > 2
    BEGIN
        SET @union = ' UNION'
    END

    PRINT (@sql + @union);

    DELETE FROM @Views where name = @viewName;
    SET @viewName = (select top 1 name from @Views);
END;

Insert/update based on dynamic XML

I have following XML:
<NewDataSet>
<Data>
<Id>560f05b2-b215-4fea-9ac6-7f012fbca331</Id>
<Number>384D25334E04593B6DE9955E72F413F8A0A828FF</Number>
<CurrentDate>2012-11-21T09:09:26+00:00</CurrentDate>
</Data>
<Data>
<Id>9cff574b-59ea-4cbd-a2db-9ed02b6cc602</Id>
<Number>384D25334E04593B6DE9955E72F413F8A0A828FF</Number>
<Location>Town</Location>
<CurrentDate>2012-11-21T09:09:53+00:00</CurrentDate>
</Data>
</NewDataSet>
I'm trying to write a query that will insert a new record or update an existing one based on the given XML. The problem is that I cannot use predefined column names, because the table structure sometimes changes. So the idea is to generate a dynamic query and apply it. So far I've got the following:
SET NOCOUNT OFF;

DECLARE @Data xml            -- assumed to already hold the XML document shown above
DECLARE @TableName nvarchar(50)
DECLARE @TableData xml
DECLARE @Query nvarchar(max)
DECLARE @UpdateTemp nvarchar(max)
DECLARE @Insert1Temp nvarchar(max)
DECLARE @Insert2Temp nvarchar(max)
DECLARE @InsertTemp nvarchar(max)
DECLARE @Id uniqueidentifier
DECLARE @CurrentDate datetime

-- declare cursor
DECLARE cursor_inserting CURSOR LOCAL FAST_FORWARD FOR
    SELECT
        r.value('fn:local-name(.)', 'nvarchar(50)'),
        r.query('.')
    FROM @Data.nodes('//NewDataSet/*') AS records(r)
    ORDER BY r.value('fn:local-name(.)', 'nvarchar(50)')

-- open cursor
OPEN cursor_inserting
FETCH NEXT FROM cursor_inserting INTO @TableName, @TableData

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Get id and date
    SELECT @Id = o.value('Id[1]', 'uniqueidentifier') FROM @TableData.nodes('*') as n(o)
    SELECT @CurrentDate = o.value('CurrentDate[1]', 'datetime') FROM @TableData.nodes('*') as n(o)

    SET @Query = NULL

    -- temporary update query
    SET @UpdateTemp = NULL
    SELECT @UpdateTemp = COALESCE(@UpdateTemp + ', ', '') + o.value('fn:local-name(.)', 'nvarchar(50)') + ' = ''' + CAST(o.query('text()') as nvarchar(4000)) + '''' FROM @TableData.nodes('/*/*') as n(o)
    SET @UpdateTemp = 'UPDATE ' + @TableName + ' SET ' + @UpdateTemp + ' WHERE Id = ''' + CAST(@Id as nvarchar(40)) + ''''

    -- temporary insert query
    SET @Insert1Temp = NULL
    SELECT @Insert1Temp = COALESCE(@Insert1Temp + ', ', '') + o.value('fn:local-name(.)', 'nvarchar(50)') FROM @TableData.nodes('/*/*') as n(o)
    SET @Insert2Temp = NULL
    SELECT @Insert2Temp = COALESCE(@Insert2Temp + ', ', '') + '''' + CAST(o.query('text()') as nvarchar(4000)) + '''' FROM @TableData.nodes('/*/*') as n(o)
    SET @InsertTemp = 'INSERT INTO ' + @TableName + ' ( ' + @Insert1Temp + ' ) VALUES ( ' + @Insert2Temp + ' )'

    IF @TableName = 'Data'
    BEGIN
        IF EXISTS (SELECT * FROM Data WHERE Id = @Id)
        BEGIN
            IF EXISTS (SELECT * FROM tblAudit WHERE Id = @Id AND CurrentDate < @CurrentDate)
            BEGIN
                SET @Query = @UpdateTemp
            END
        END
        ELSE
        BEGIN
            SET @Query = @InsertTemp
        END
    END

    IF @Query IS NOT NULL
    BEGIN
        SELECT @Query
        EXEC (@Query)
    END

    FETCH NEXT FROM cursor_inserting INTO @TableName, @TableData
END

CLOSE cursor_inserting
DEALLOCATE cursor_inserting
If there is any better way to achieve this inside SQL, I would like to know. I know that I can do this outside SQL in my application code, but I would like to have it in one place, in a stored procedure, so I can provide the XML and have the required action taken.
UPDATE 1
I would like to clarify that my main problem is proper query generation based on the XML. A different way of handling the insert/update is nice to see, but only as an addition.
UPDATE 2
There can be more than one table in the XML, e.g. not only Data but also Data2.
UPDATE 3
I've updated what I have now; it is now generating a proper INSERT/UPDATE, however I now have issues with conversion. E.g. the date string is in XML format and SQL doesn't want to convert it automatically. So my next step is to get the proper column type from the database and, instead of generating a query, insert directly from the XML. I hope this will work.
Yes.
You can use MERGE and SQL XQuery to do it in one statement.
Something like...
merge Data as target
using
(
    select
        x.q.value('Id[1]','uniqueidentifier') as ID,
        x.q.value('Number[1]','varchar(50)') as Number,
        x.q.value('Location[1]','varchar(50)') as Town,
        x.q.value('CurrentDate[1]','datetime') as CurrentDate
    from
        @TableData.nodes('/NewDataSet/Data') x(q)
) as Source (ID, Number, Town, CurrentDate)
on target.id = source.id
when matched and target.CurrentDate < source.CurrentDate then
    update set
        Number = source.number,
        town = source.town,
        currentdate = source.currentdate
when not matched then
    insert (ID, number, town, currentdate)
    values (source.id, source.number, source.town, source.currentdate);

MERGE without specifying column names in SQL Server 2008

I was looking at the MERGE command, which seems cool, but it still requires the columns to be specified. I'm looking for something like:
MERGE INTO target AS t
USING source AS s
WHEN MATCHED THEN
UPDATE SET
[all t.fields = s.fields]
WHEN NOT MATCHED THEN
INSERT ([all fields])
VALUES ([all s.fields])
Is it possible?
I'm lazy... this is a cheap proc I wrote that will spit out a general MERGE command for a table. It queries information_schema.columns for column names. I ripped out my source database name, so you have to update the proc to work with your database (look for @SourceDB... I said it was cheap). Anyway, I know others could write it much better; it served my purpose well. (It makes a couple of assumptions that you could put logic in to handle, namely turning IDENTITY_INSERT OFF even when a table doesn't have identity columns.) It updates the table in your current context. It was written against SQL Server 2008 to sync up some tables. Use at your own risk, of course.
CREATE PROCEDURE [dbo].[GenerateMergeSQL]
    @TableName varchar(100)
AS
BEGIN
    SET NOCOUNT ON

    declare @sql varchar(5000), @SourceInsertColumns varchar(5000), @DestInsertColumns varchar(5000), @UpdateClause varchar(5000)
    declare @ColumnName varchar(100), @identityColName varchar(100)
    declare @IsIdentity int, @IsComputed int, @Data_Type varchar(50)
    declare @SourceDB as varchar(200)

    -- source/dest i.e. 'instance.catalog.owner.' - table names will be appended to this
    -- the destination is your current db context
    set @SourceDB = '[mylinkedserver].catalog.myDBOwner.'

    set @sql = ''
    set @SourceInsertColumns = ''
    set @DestInsertColumns = ''
    set @UpdateClause = ''
    set @ColumnName = ''
    set @isIdentity = 0
    set @IsComputed = 0
    set @identityColName = ''
    set @Data_Type = ''

    DECLARE @ColNames CURSOR
    SET @ColNames = CURSOR FOR
        select column_name, COLUMNPROPERTY(object_id(TABLE_NAME), COLUMN_NAME, 'IsIdentity') as IsIdentity,
               COLUMNPROPERTY(object_id(TABLE_NAME), COLUMN_NAME, 'IsComputed') as IsComputed, DATA_TYPE
        from information_schema.columns where table_name = @TableName order by ordinal_position

    OPEN @ColNames
    FETCH NEXT FROM @ColNames INTO @ColumnName, @isIdentity, @IsComputed, @DATA_TYPE

    WHILE @@FETCH_STATUS = 0
    BEGIN
        if @IsComputed = 0 and @DATA_TYPE <> 'timestamp'
        BEGIN
            set @SourceInsertColumns = @SourceInsertColumns +
                case when @SourceInsertColumns = '' THEN '' ELSE ',' end +
                'S.' + @ColumnName

            set @DestInsertColumns = @DestInsertColumns +
                case when @DestInsertColumns = '' THEN '' ELSE ',' end +
                @ColumnName

            if @isIdentity = 0
            BEGIN
                set @UpdateClause = @UpdateClause +
                    case when @UpdateClause = '' THEN '' ELSE ',' end
                    + @ColumnName + ' = ' + 'S.' + @ColumnName + char(10)
            END

            if @isIdentity = 1 set @identityColName = @ColumnName
        END

        FETCH NEXT FROM @ColNames INTO @ColumnName, @isIdentity, @IsComputed, @DATA_TYPE
    END
    CLOSE @ColNames
    DEALLOCATE @ColNames

    SET @sql = 'SET IDENTITY_INSERT ' + @TableName + ' ON;
    MERGE ' + @TableName + ' AS D
    USING ' + @SourceDB + @TableName + ' AS S
    ON (D.' + @identityColName + ' = S.' + @identityColName + ')
    WHEN NOT MATCHED BY TARGET
        THEN INSERT(' + @DestInsertColumns + ')
        VALUES(' + @SourceInsertColumns + ')
    WHEN MATCHED
        THEN UPDATE SET
        ' + @UpdateClause + '
    WHEN NOT MATCHED BY SOURCE
        THEN DELETE
    OUTPUT $action, Inserted.*, Deleted.*;
    SET IDENTITY_INSERT ' + @TableName + ' OFF'

    Print @SQL
END
Not everything you wanted, but partially:
WHEN NOT MATCHED THEN
INSERT
VALUES (field1, field2, ...)
(The values list has to be complete, and match the order of the fields in your table's definition.)
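For example, a sketch of that partial form against the ProductsUS / ProductsChina tables defined further down (not part of the original answer):
MERGE INTO ProductsUS AS t
USING ProductsChina AS s
ON t.ProductID = s.ProductID
WHEN MATCHED THEN
    UPDATE SET ProductName = s.ProductName, Rate = s.Rate
WHEN NOT MATCHED THEN
    INSERT                                          -- no column list
    VALUES (s.ProductID, s.ProductName, s.Rate);    -- must cover every column, in table-definition order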
A simple alternative to MERGE without naming any fields or having to update the statement whenever the table design changes. This is uni-directional from source to target, but it can be made bi-directional. It only acts on changed records, so it is very fast even with linked servers on slower connections.
--Two statements run as a transaction batch
DELETE C
FROM productschina C
JOIN (select * from productschina c except select * from productsus) z
    on c.productid = z.productid

INSERT into productschina select * from productsus except select * from productschina
Here is code to setup tables to test above:
--Create a target table
--drop table ProductsUS
CREATE TABLE ProductsUS
(
ProductID INT PRIMARY KEY,
ProductName VARCHAR(100),
Rate MONEY
)
GO
--Insert records into target table
INSERT INTO ProductsUS
VALUES
(1, 'Tea', 10.00),
(2, 'Coffee', 20.00),
(3, 'Muffin', 30.00),
(4, 'Biscuit', 40.00)
GO
--Create source table
--drop table productschina
CREATE TABLE ProductsChina
(
ProductID INT PRIMARY KEY,
ProductName VARCHAR(100),
Rate MONEY
)
GO
--Insert records into source table
INSERT INTO ProductsChina
VALUES
(1, 'Tea', 10.00),
(2, 'Coffee', 25.00),
(3, 'Muffin', 35.00),
(5, 'Pizza', 60.00)
GO
SELECT * FROM ProductsUS
SELECT * FROM ProductsChina
GO
I think this answer deserves a little more love. It's simple, elegant and works. However, depending on the tables in question, it may be a little bit slow because the except clause is evaluating every column.
I suspect you can save a little bit of time by just joining on the primary key and the last modified date (if one exists).
DELETE
C
FROM
productschina C
JOIN
(select primary_key, last_mod_date from productschina c except select primary_key, last_mod_date from productsus) z
on c.productid=z.productid
INSERT into productschina select * from productsus except select * from productschina
