I'm trying to dynamically create triggers, but ran into a confusing issue around using sp_executesql and passing parameters into the dynamic SQL. The following simple test case works:
DECLARE @tableName sysname = 'MyTable';
DECLARE @sql nvarchar(max) = N'
CREATE TRIGGER TR_' + @tableName + N' ON ' + @tableName + N' FOR INSERT
AS
BEGIN
PRINT 1
END';
EXEC sp_executesql @sql
However, I want to be able to use @tableName (and other values) as variables within the script, so I passed it along to the sp_executesql call:
DECLARE @tableName sysname = 'ContentItems';
DECLARE @sql nvarchar(max) = N'
CREATE TRIGGER TR_' + @tableName + N' ON ' + @tableName + N' FOR INSERT
AS
BEGIN
PRINT @tableName
END';
EXEC sp_executesql @sql, N'@tableName sysname', @tableName=@tableName
When running the above, I get an error:
Msg 156, Level 15, State 1, Line 2
Incorrect syntax near the keyword 'TRIGGER'.
After trying a few things, I've discovered that even if I don't use @tableName in the dynamic SQL at all, I still get this error. I also get this error when trying to create a PROCEDURE (except, obviously, the message is Incorrect syntax near the keyword 'PROCEDURE'.)
Since the SQL runs fine either directly or when not supplying parameters to sp_executesql, it seems like I'm running into a true limitation in the SQL engine, but I don't see it documented anywhere. Does anyone know if there is a way to pass parameters to a dynamic CREATE script, or at least have insight into the underlying limitation that's being hit?
Update
If I add a PRINT statement, I get the SQL below, which is valid and runs successfully (when run directly). I still get the error even if there's nothing dynamic in the SQL (it's just a single string with no concatenation).
CREATE TRIGGER TR_ContentItems ON ContentItems FOR INSERT
AS
BEGIN
PRINT @tableName
END
I also get the same error whether using sysname or nvarchar(max) for the parameter.
If you execute the CREATE TRIGGER statement that you said you printed, you will find that it does not work. The PRINT statement in the body of the trigger is trying to output @tableName, but it is never defined, so you will get an error:
Must declare the scalar variable "@tableName".
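For illustration, a minimal sketch of a fix for that particular sub-issue (not the main problem) is to bake the table name into the trigger body as a string literal while building the dynamic SQL, so the created trigger never references an undeclared variable:
-- Sketch: embed the name as a literal; the quotes around the value are doubled
-- because they sit inside the outer string.
DECLARE @tableName sysname = 'ContentItems';
DECLARE @sql nvarchar(max) = N'
CREATE TRIGGER TR_' + @tableName + N' ON ' + @tableName + N' FOR INSERT
AS
BEGIN
    PRINT ''' + @tableName + N'''
END';
EXEC sp_executesql @sql;  -- executed without a parameter list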
But that is not your main issue. As for why you can't seem to execute a DDL statement with sp_executesql with parameters, I couldn't find any documentation to explain why... but your experience (and others') proves that it's troublesome. I believe this post has a pretty good theory: sp_executesql adds statements to executed dynamic script?
You can however execute dynamic sql with DDL statements using the EXECUTE statement. So what you could do is create a parameterized sp_executesql statement that validates your table name and then creates a dynamic sql string to execute with the EXECUTE statement.
It doesn't look pretty, but it works:
DECLARE @tableName sysname = 'MyTable';
DECLARE @sql nvarchar(max) =
N'
set @tableName = (SELECT name FROM sys.tables WHERE OBJECT_ID = OBJECT_ID(@tableName)) --validate table
DECLARE @CreateTriggerSQL as varchar(max) =
''
CREATE TRIGGER '' + QUOTENAME(''TR_'' + @tableName) + '' ON '' + QUOTENAME( @tableName) + '' FOR INSERT
AS
BEGIN
PRINT '''''' + @tableName + ''''''
END
''
print isnull(@CreateTriggerSQL, ''INVALID TABLE'')
exec (@CreateTriggerSQL)
';
EXEC sp_executesql @sql, N'@tableName sysname', @tableName=@tableName;
You could also convert this into a stored procedure with parameters instead of running sp_executesql if that were more convenient. It looks a bit cleaner:
CREATE PROCEDURE sp_AddTriggerToTable (@TableName AS sysname) AS
set @tableName = (SELECT name FROM sys.tables WHERE OBJECT_ID = OBJECT_ID(@tableName)) --validate table
DECLARE @CreateTriggerSQL as varchar(max) =
'
CREATE TRIGGER ' + QUOTENAME('TR_' + @tableName) + ' ON ' + QUOTENAME( @tableName) + ' FOR INSERT
AS
BEGIN
PRINT ''' + @tableName + '''
END
'
print isnull(@CreateTriggerSQL, 'INVALID TABLE')
exec (@CreateTriggerSQL)
GO
I would strongly caution against using dynamic SQL with table names. You are setting yourself up for some serious SQL injection issues. You should validate anything that goes into the @tableName variable.
That said, in your example...
DECLARE @tableName sysname = 'ContentItems';
DECLARE @sql nvarchar(max) = N'
CREATE TRIGGER TR_' + @tableName + N' ON ' + @tableName + N' FOR INSERT
AS
BEGIN
PRINT @tableName
END';
EXEC sp_executesql @sql, N'@tableName sysname', @tableName=@tableName
... you are trying to input your declared @tableName into the text you're creating for @sql, and then you're trying to pass a parameter through sp_executesql. This makes your @sql invalid when trying to call it.
You can try:
DECLARE @tableName sysname = 'ContentItems';
DECLARE @sql nvarchar(max) = N'
CREATE TRIGGER TR_'' + @tableName + N'' ON '' + @tableName + N'' FOR INSERT
AS
BEGIN
PRINT @tableName
END';
EXEC sp_executesql @sql, N'@tableName sysname', @tableName=@tableName
... which will give you the string ...
'
CREATE TRIGGER TR_' + @tableName + N' ON ' + @tableName + N' FOR INSERT
AS
BEGIN
PRINT @tableName
END'
... which can then accept the parameter you pass through ...
EXEC sp_executesql @sql, N'@tableName sysname', @tableName=@tableName ;
Again, I'd use some heavy validation (and white-listing) before passing anything into dynamic SQL that will use a dynamic table name.
NOTE: As noted elsewhere, I believe sp_executesql() with parameters is effectively limited to DML statements, and parameterization itself has limits as well. And based on your other comments, it doesn't sound like you really need a dynamic process, just a way to repeat a specific task for a handful of elements. If that's the case, my recommendation is to do it manually with a copy/paste and then execute the statements.
Since the SQL runs fine either directly or when not supplying parameters to sp_executesql, this seems like I'm running into a true limitation in the SQL engine, but I don't see it documented anywhere.
This behavior is documented, albeit not intuitive. The relevant excerpt from the documentation under the trigger limitations topic:
CREATE TRIGGER must be the first statement in the batch
When you execute a parameterized query, the parameter declarations are counted as being part of the batch. Consequently, a CREATE TRIGGER batch (and other CREATE statements for programmability objects like procs, functions, etc.) cannot be executed as a parameterized query.
The invalid syntax error message you get when you attempt to run CREATE TRIGGER as a parameterized query isn't particularly helpful. Below is a simplified version of your code using the undocumented and unsupported internal parameterized query syntax.
EXECUTE(N'(@tableName sysname = N''MyTable'')CREATE TRIGGER TR_MyTable ON dbo.MyTable FOR INSERT AS');
This at least yields an error calling out the CREATE TRIGGER limitation:
Msg 1050, Level 15, State 1, Line 73
This syntax is only allowed for parameterized queries.
Msg 111, Level 15, State 1, Line 73
'CREATE TRIGGER' must be the first statement in a query batch.
Similarly executing another parameterized statement with this method runs successfully:
EXECUTE (N'(@tableName sysname = N''MyTable'')PRINT @tableName');
But if you don't actually use the parameter in the batch, an error results:
EXECUTE (N'(@tableName sysname = N''MyTable'')PRINT ''done''');
Msg 1050, Level 15, State 1, Line 75
This syntax is only allowed for parameterized queries.
The bottom line is that you need to build the CREATE TRIGGER statement as a string without parameters and execute the statement as a non-parameterized query to create a trigger.
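For what that can look like in practice, here is a minimal sketch (assuming a table named dbo.MyTable exists): the name is validated against sys.tables, delimited with QUOTENAME, and the finished string is executed without a parameter list, so CREATE TRIGGER really is the first statement in its batch.
DECLARE @tableName sysname = N'MyTable';  -- assumed table name for illustration
IF EXISTS (SELECT 1 FROM sys.tables WHERE name = @tableName)
BEGIN
    DECLARE @sql nvarchar(max) =
          N'CREATE TRIGGER ' + QUOTENAME(N'TR_' + @tableName)
        + N' ON ' + QUOTENAME(@tableName) + N' FOR INSERT
AS
BEGIN
    PRINT ' + QUOTENAME(@tableName, '''') + N';
END';
    EXEC sp_executesql @sql;  -- no @params argument, so the batch is not parameterized
END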
Is it possible to issue CREATE statements using sp_executesql with parameters?
The simple answer is "No", you can't.
According to MSDN
Generally, parameters are valid only in Data Manipulation Language (DML) statements, and not in Data Definition Language (DDL) statements
You can check more details about this in Statement Parameters.
What is the issue?
Parameters are only allowed in place of scalar literals, like quoted strings or dates, or numeric values. You can't parameterise a DDL operation.
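As a small illustration of that rule (dbo.MyTable and its Id column are just assumed names): the DML parameter stands in for a scalar literal and works, while there is no literal position for a DDL object name to be parameterized into.
-- DML: the parameter replaces a scalar literal, so this runs.
EXEC sp_executesql
     N'SELECT * FROM dbo.MyTable WHERE Id = @id',
     N'@id int',
     @id = 1;
-- DDL: the object name itself would have to vary, which parameters cannot do.
-- EXEC sp_executesql N'CREATE TABLE @name (Id int)', N'@name sysname', @name = N'T1';  -- fails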
What can be done?
I believe the reason you want to use parameterized sp_executesql is to avoid any SQL injection attack. To achieve this for DDL operations, you can do the following things to minimize the possibility of an attack.
Use Delimiters: You can use QUOTENAME() for sysname parameters like trigger names, table names and column names.
Limiting Permissions: The user account you use to run the dynamic DDL should have only limited permissions, e.g. on a specific schema with only CREATE permission.
Hiding Error Messages: Don't throw the actual error to the user. SQL injection is mostly performed by trial and error; if you hide the actual error message, it becomes much tougher to crack.
Input Validation: You can always have a function which validates the input string, escapes the required characters, and checks for specific keywords like DROP (a sketch follows this list).
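A hedged sketch of that input-validation idea (the table name is accepted only if it resolves to an existing table, and it is still delimited with QUOTENAME afterwards; THROW requires SQL Server 2012 or later):
DECLARE @tableName sysname = N'ContentItems';  -- value supplied by the caller
IF NOT EXISTS (SELECT 1 FROM sys.tables WHERE name = @tableName)
    THROW 50000, 'Unknown table name.', 1;
DECLARE @sql nvarchar(max) = N'SELECT COUNT(*) FROM ' + QUOTENAME(@tableName) + N';';
EXEC sp_executesql @sql;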
Any workaround?
If you want to parameterize your statement using sp_executesql, you can build the query to be executed into an OUTPUT variable and then run that query in a second statement, like the following.
This way, the first call to sp_executesql parameterizes your query, and the actual execution is performed by the second call to sp_executesql.
For example:
DECLARE @TableName VARCHAR(100) = 'MyTable'
DECLARE @returnStatement NVARCHAR(max);
DECLARE @sql1 NVARCHAR(max)=
N'SELECT @returnStatement = ''CREATE TRIGGER TR_''
+ @TableName + '' ON '' + @TableName + '' FOR INSERT AS BEGIN PRINT 1 END'''
EXEC Sp_executesql
@sql1,
N'@returnStatement VARCHAR(MAX) OUTPUT, @TableName VARCHAR(100)',
@returnStatement output,
@TableName
EXEC Sp_executesql @returnStatement
Is it possible to issue CREATE statements using sp_executesql with parameters?
The answer is "Yes", but with a small adjustment:
USE msdb
DECLARE @tableName sysname = 'sysjobsteps';
DECLARE @sql nvarchar(max) = N'
EXECUTE ('' -- Added nested EXECUTE()
CREATE TRIGGER [TR_'' + @tableName + N''] ON ['' + @tableName + N''] FOR INSERT
AS
BEGIN
PRINT '''''+@tableName+'''''
END''
)' -- End of EXECUTE()
EXEC sp_executesql @sql, N'@tableName sysname', @tableName=@tableName
Adjustments list:
An extra EXECUTE is involved; the comment in the code explains why.
Extra square brackets are added to make SQL injection slightly harder.
I'm looking for specific (ideally, documented) restrictions of sp_executesql with parameters and if there are any workarounds for those specific restrictions (beyond not using parameters)
In this case it is a limitation of DDL commands, not of sp_executesql. DDL statements cannot be parameterized using variables. The Microsoft documentation says:
Variables can be used only in expressions, not in place of object names or keywords. To construct dynamic SQL statements, use EXECUTE.
source: DECLARE (Transact-SQL)
Therefore, I provided the solution with EXECUTE as a workaround.
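To illustrate that rule with a trivial (non-DDL) example:
DECLARE @t sysname = N'MyTable';          -- assumed table name
-- SELECT * FROM @t;                      -- fails: a variable cannot replace an object name
DECLARE @stmt nvarchar(max) = N'SELECT * FROM ' + QUOTENAME(@t) + N';';
EXEC (@stmt);                             -- works: the name is baked into the string before execution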
Personally I hate triggers and try to avoid them most of the time ;)
However if you really, really need this dynamic stuff you should use sp_MSforeachtable and avoid injection (as pointed out by Shawn) at any cost:
EXEC sys.sp_MSforeachtable
@command1 = '
DECLARE @sql NVARCHAR(MAX)
SET @sql = CONCAT(''CREATE TRIGGER TR_''
, REPLACE(REPLACE(REPLACE(''?'', ''[dbo].'', ''''),''['',''''),'']'','''')
, '' ON ? FOR INSERT
AS
BEGIN
PRINT ''''?'''';
END;'');
EXEC sp_executesql @sql;'
, @whereand = ' AND object_id IN (SELECT object_id FROM sys.objects
WHERE name LIKE ''%ContentItems%'')';
If you want to use the parameter as a string, add a doubled ' before and after the parameter name, like this:
DECLARE @tableName sysname = 'ContentItems';
DECLARE @sql nvarchar(max) = N'
CREATE TRIGGER TR_' + @tableName + N' ON ' + @tableName + N' FOR INSERT
AS
BEGIN
print ''' + @tableName
+''' END';
EXEC sp_executesql @sql
And if you want to use it as a table name, use SELECT instead of PRINT, like this:
DECLARE @tableName sysname = 'ContentItems';
DECLARE @sql nvarchar(max) = N'
CREATE TRIGGER TR_' + @tableName + N' ON ' + @tableName + N' FOR INSERT
AS
BEGIN
select * from ' + @tableName
+' END';
EXEC sp_executesql @sql
I wonder, is it possible to make this work?
declare @SQL varchar(max),
@query varchar(max) ='select * from Table'
select @SQL='USE LinkedServer.DBName ' + @query
exec (@SQL)
It works just fine if you use 'USE' to run queries on the same server, but not on Linked ones.
What I want is to create a procedure which will run some dynamic queries on different servers and DBs, and I want to pass ServerName and DBName as parameters to that SP.
NOTE: I don't want to use it this way:
declare @SQL varchar(max)
select @SQL='
select * from LinkedServer.DBName..Table'
exec (@SQL)
Since that SELECT statement will be retrieved from a table.
Sure, there is a little-known trick: execute the remote database's sp_executesql:
DECLARE @LinkedServer sysname = N'LinkedServer',
@DatabaseName sysname = N'DBName';
-- above are input params
DECLARE @exec nvarchar(4000), @sql nvarchar(max);
SET @sql = N'SELECT * FROM dbo.Table;';
-- or SELECT @sql = DynamicSQL FROM dbo.SomeLocalTable WHERE...
SET @exec = QUOTENAME(@LinkedServer) + N'.'
+ QUOTENAME(@DatabaseName)
+ N'.sys.sp_executesql';
EXEC @exec @sql;
Don't use parentheses. EXEC('string') is evil, evil, evil.
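As a side note (not in the original answer, so treat the names as assumptions about the same linked server and database), the remote sp_executesql accepts a parameter list exactly like a local call, so the inner statement itself can still be parameterized:
DECLARE @exec nvarchar(4000) =
    QUOTENAME(N'LinkedServer') + N'.' + QUOTENAME(N'DBName') + N'.sys.sp_executesql';
DECLARE @sql nvarchar(max) =
    N'SELECT * FROM dbo.SomeTable WHERE SomeColumn = @p1;';  -- hypothetical table and column
EXEC @exec @sql, N'@p1 int', @p1 = 42;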
I need to define the database as a variable in a query like this:
insert into Database1.dbo.Table1
select * from Database2.dbo.Table1
In my case all the databases have the same schema.
I can do it with dynamic SQL of course, but is there a way to have a syntax like:
insert into Database1.dbo.Table1
select * from @ChosenDatabase.dbo.Table1
?
Thanks
What you are trying to achieve is not possible; you have to use dynamic SQL:
DECLARE @SQLQuery varchar(300)
DECLARE @TableName varchar(100)
SET @TableName = 'ChosenDatabase.dbo.Table1'
SET @SQLQuery = 'INSERT INTO Database1.dbo.Table1 SELECT * FROM ' + @TableName
EXEC(@SQLQuery);
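A slightly safer variant of the same idea (a sketch using the database name assumed above) quotes the identifier so a hostile value cannot break out of the name:
DECLARE @ChosenDatabase sysname = N'Database2';
DECLARE @SQLQuery nvarchar(500) =
    N'INSERT INTO Database1.dbo.Table1 SELECT * FROM '
    + QUOTENAME(@ChosenDatabase) + N'.dbo.Table1;';
EXEC sp_executesql @SQLQuery;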
Declare @dbname varchar(200)
Declare @sql varchar(1000)
-- now comes dynamic sql
set @sql='select * from ' + @dbname + '.dbo.yourtable';
exec (@sql)
Try to use dynamic sql
declare @ChosenDatabase varchar(100)='dbname'
DECLARE @SQL NVARCHAR(MAX)
SET @SQL= 'select * from '+ @ChosenDatabase+'.dbo.Customer'
--SELECT @SQL
EXEC sp_executesql @SQL
I am very new to SQL. I want to access a variable dynamically in a select statement.
declare @sql NVARCHAR(MAX)
declare @tableName varchar(100)
set @tableName='xxxx'
set @sql='select * from ' +@tableName+
EXEC sys.sp_executesql @sql
But every time I am executing the above query I am getting an error:
Incorrect syntax near the keyword 'EXEC'.
declare @sql NVARCHAR(MAX);
declare @tableName NVARCHAR(128);
set @tableName='xxxx';
SET @sql = N'select * from ' + QUOTENAME(@tableName)
EXECUTE sp_executesql @sql
Use the QUOTENAME() function when concatenating user-supplied variables into your dynamic SQL. It protects you against a possible SQL injection attack.
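To see what QUOTENAME() actually does with a hostile value, consider this small illustration:
DECLARE @tableName nvarchar(128) = N'xxxx]; DROP TABLE Users; --';
SELECT QUOTENAME(@tableName) AS QuotedName;
-- Returns: [xxxx]]; DROP TABLE Users; --]
-- The closing bracket is doubled, so the whole value remains a single identifier.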
You've got an extra plus sign after @tableName - remove it:
set @sql='select * from ' +@tableName /*+ */
EXEC sys.sp_executesql @sql
You had one too many plus (+) signs after @tableName.
Please note that this method is wide open to an injection attack though.
declare @sql NVARCHAR(MAX)
declare @tableName varchar(100)
set @tableName='xxxx'
set @sql='select * from ' +@tableName
EXEC sys.sp_executesql @sql
I'm trying to switch the current database with a SQL statement.
I have tried the following, but all attempts failed:
USE @DatabaseName
EXEC sp_sqlexec @Sql -- where @Sql = 'USE [' + @DatabaseName + ']'
To add a little more detail.
EDIT: I would like to perform several things on two separate databases, where both are configured with a variable. Something like this:
USE Database1
SELECT * FROM Table1
USE Database2
SELECT * FROM Table2
The problem with the former is that what you're doing is USE 'myDB' rather than USE myDB.
You're passing a string, but USE is looking for an explicit reference.
The latter example works for me.
declare @sql varchar(20)
select @sql = 'USE myDb'
EXEC sp_sqlexec @Sql
-- also works
select @sql = 'USE [myDb]'
EXEC sp_sqlexec @Sql
exec sp_executesql @Sql
The DB change only lasts for the time it takes to complete @sql.
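A small sketch to demonstrate that scoping (assuming a database named myDb exists):
DECLARE @sql nvarchar(200) = N'USE myDb; SELECT DB_NAME() AS InsideDynamicBatch;';
EXEC sp_executesql @sql;                 -- returns myDb
SELECT DB_NAME() AS AfterDynamicBatch;   -- back in the original database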
http://blog.sqlauthority.com/2007/07/02/sql-server-2005-comparison-sp_executesql-vs-executeexec/
I had the same problem; I overcame it with an ugly -- but useful -- set of GOTOs.
The reason I call the "script runner" before everything is that I want to hide the complexity and ugly approach from any developer that just wants to work with the actual script. At the same time, I can make sure that the script is run in the two (extensible to three and more) databases in the exact same way.
GOTO ScriptRunner
ScriptExecutes:
--------------------ACTUAL SCRIPT--------------------
-------- Will be executed in DB1 and in DB2 ---------
--TODO: Your script right here
------------------ACTUAL SCRIPT ENDS-----------------
GOTO ScriptReturns
ScriptRunner:
USE DB1
GOTO ScriptExecutes
ScriptReturns:
IF (db_name() = 'DB1')
BEGIN
USE DB2
GOTO ScriptExecutes
END
With this approach you get to keep your variables and SQL Server does not freak out if you happen to go over a DECLARE statement twice.
Just wanted to thank KM for his valuable solution.
I implemented it myself to reduce the amount of lines in a shrinkdatabase request on SQLServer.
Here is my SQL request, if it can help anyone:
-- Declare the variable to be used
DECLARE @Query varchar (1000)
DECLARE @MyDBN varchar(11);
-- Initializing the @MyDBN variable (possible values : db1, db2, db3, ...)
SET @MyDBN = 'db1';
-- Creating the request to execute
SET @Query='use '+ @MyDBN +'; ALTER DATABASE '+ @MyDBN +' SET RECOVERY SIMPLE WITH NO_WAIT; DBCC SHRINKDATABASE ('+ @MyDBN +', 1, TRUNCATEONLY); ALTER DATABASE '+ @MyDBN +' SET RECOVERY FULL WITH NO_WAIT'
--
EXEC (@Query)
try this:
DECLARE @Query varchar(1000)
DECLARE @DatabaseName varchar(500)
SET @DatabaseName='xyz'
SET @Query='SELECT * FROM Server.'+@DatabaseName+'.Owner.Table1'
EXEC (@Query)
SET @DatabaseName='abc'
SET @Query='SELECT * FROM Server.'+@DatabaseName+'.Owner.Table2'
EXEC (@Query)
In case someone needs a solution for this, here is one:
If you use a dynamic USE statement, your whole query needs to be dynamic, because everything has to run in the same context.
You can try a SYNONYM, which is basically an alias for a specific table. The synonym is recorded in the sys.synonyms catalog view, so you have access to it from any context.
Look at this static statement:
CREATE SYNONYM MASTER_SCHEMACOLUMNS FOR Master.INFORMATION_SCHEMA.COLUMNS
SELECT * FROM MASTER_SCHEMACOLUMNS
Now dynamic:
DECLARE @SQL VARCHAR(200)
DECLARE @CATALOG VARCHAR(200) = 'Master'
IF EXISTS(SELECT * FROM sys.synonyms s WHERE s.name = 'CURRENT_SCHEMACOLUMNS')
BEGIN
DROP SYNONYM CURRENT_SCHEMACOLUMNS
END
SELECT @SQL = 'CREATE SYNONYM CURRENT_SCHEMACOLUMNS FOR '+ @CATALOG +'.INFORMATION_SCHEMA.COLUMNS';
EXEC sp_sqlexec @SQL
-- Your non-dynamic code
SELECT * FROM CURRENT_SCHEMACOLUMNS
Now just change the value of @CATALOG and you will be able to list the same table but from a different catalog.
If SQLCMD is an option, it supports scripting variables above and beyond what straight T-SQL can do. For example: http://msdn.microsoft.com/en-us/library/ms188714.aspx
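For example, a minimal SQLCMD-mode sketch (run it through sqlcmd.exe or SSMS with SQLCMD mode enabled; the database name is just an assumed example):
:setvar DatabaseName "Database1"
USE [$(DatabaseName)];
SELECT DB_NAME() AS CurrentDatabase;
The same script can then be pointed at another database from the command line, e.g. sqlcmd -S MyServer -i script.sql -v DatabaseName="Database2".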
You can do this:
Declare @dbName nvarchar(max);
SET @dbName = 'TESTDB';
Declare @SQL nvarchar(max);
select @SQL = 'USE ' + @dbName +'; {can put command(s) here}';
EXEC (@SQL);
{but not here!}
This means you can do a recursive select like the following:
Declare @dbName nvarchar(max);
SET @dbName = 'TESTDB';
Declare @SQL nvarchar(max);
SELECT @SQL = 'USE ' + @dbName + '; ' +(Select ... {query here}
For XML Path(''),Type)
.value('text()[1]','nvarchar(max)');
Exec (@SQL)
Use exec sp_executesql @Sql
Example
DECLARE @sql as nvarchar(100)
DECLARE @paraDOB datetime
SET @paraDOB = '1/1/1981'
SET @sql=N'SELECT * FROM EmpMast WHERE DOB >= @paraDOB'
exec sp_executesql @sql,N'@paraDOB datetime',@paraDOB
-- If you are using a variable for the database name.
-- Try something like this.
DECLARE @DBName varchar(50)
Set @DBName = 'Database1'; /* could be passed in by a parameter. */
IF( @DBName = 'Database1')
Begin
USE [Database1];
SELECT * FROM Table1;
End
IF( @DBName = 'Database2')
Begin
USE [Database2];
SELECT * FROM Table2;
End
IF( @DBName is null)
Begin
USE [Database1];
End