I would like to know how I can switch from one database to another within the same script. I have a script that reads the header information from a SQL Server .BAK file and loads it into a temp table in a test database. Once the information is in the temp table (in the Test database), I run the following script to get the database name.
This part works fine.
INSERT INTO #HeaderInfo EXEC('RESTORE HEADERONLY
FROM DISK = N''I:\TEST\database.bak''
WITH NOUNLOAD')
DECLARE @databasename varchar(128);
SET @databasename = (SELECT DatabaseName FROM #HeaderInfo);
The problem is that when I try to run the following statement, nothing happens. The new database is never selected and the script is still running against the test database.
EXEC ('USE ' + @databasename)
The goal is to switch to the new database (USE NewDatabase) so that the other part of my script (DBCC CHECKDB) can run. That part checks the integrity of the database and saves the results to a temp table.
What am I doing wrong?
You can't expect a USE statement to work this way with dynamic SQL. Dynamic SQL runs in its own context, so as soon as it has executed, you're back to your original context. This means you'd have to include your SQL statements in the same dynamic SQL execution, such as:
declare @db sysname = 'tempdb';
exec ('use ' + @db + '; dbcc checkdb;')
You can alternatively use fully qualified names for your DB objects and specify the database name in your dbcc command, even with a variable, as in:
declare @db sysname = 'tempdb';
dbcc checkdb (@db);
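Applied to the original question, a minimal sketch under some assumptions: the #DBCCResults temp table and the @cmd variable are hypothetical, the columns of #DBCCResults must match the TABLERESULTS output on your SQL Server version, and TABLERESULTS itself is a long-standing but undocumented option.
-- Read the database name from the restored header information
DECLARE @databasename sysname, @cmd nvarchar(max);
SELECT @databasename = DatabaseName FROM #HeaderInfo;
-- Build the command first; EXEC () does not allow function calls inside its parentheses
SET @cmd = N'DBCC CHECKDB (' + QUOTENAME(@databasename) + N') WITH TABLERESULTS, NO_INFOMSGS';
-- INSERT ... EXEC captures the rows that DBCC CHECKDB returns under TABLERESULTS
INSERT INTO #DBCCResults
EXEC (@cmd);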
You can't do this because the scope of EXEC is limited to the dynamic query itself. When EXEC ends, the context returns to its original state; the context only changes inside the EXEC. So you should do everything in one big dynamic statement, like:
DECLARE @str NVARCHAR(MAX)
SET @str = 'select * from table1
USE DatabaseName
select * from table2'
EXEC (@str)
I need the MSforeach_table stored procedure, which depends on the sys.MSforeach_worker system stored procedure.
I am following this source code to create the MSforeach_worker stored procedure.
The syntax there is written for dbo rather than sys, so I have changed dbo.MSforeach_worker to sys.MSforeach_worker.
When I try to create it in my databases, I get this error:
The specified schema name "sys" either does not exist or you do not
have permission to use it
And when I try to create it in the master database, I get:
CREATE PROCEDURE permission denied in database 'master'
I am confused about where I should run this script to create a system stored procedure on my SQL Server.
I have googled but could not find a solution to my problem.
First, don't use undocumented system stored procedures. These are not supported.
Second, if these undocumented procs don't already exist, you must be using Azure SQL Database. Azure SQL Database has a significantly different architecture with regards to separation of master and user databases. Rather than trying to port the procs, I suggest you create your own proc with the functionality you need. Below is an example.
CREATE PROC dbo.usp_ForEachTable
    @SQL nvarchar(MAX)
AS
DECLARE
      @SQLBatch nvarchar(MAX)
    , @TableName nvarchar(261);
DECLARE tables CURSOR LOCAL FAST_FORWARD FOR
    SELECT QUOTENAME(OBJECT_SCHEMA_NAME(object_id)) + '.' + QUOTENAME(name)
    FROM sys.tables
    WHERE is_ms_shipped = 0;
OPEN tables;
WHILE 1 = 1
BEGIN
    FETCH NEXT FROM tables INTO @TableName;
    IF @@FETCH_STATUS = -1 BREAK;
    SET @SQLBatch = REPLACE(@SQL, N'?', @TableName);
    EXEC sp_executesql @SQLBatch;
END;
CLOSE tables;
DEALLOCATE tables;
GO
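As a usage sketch (assuming the procedure above has been created in your user database), the ? placeholder is replaced with each table's two-part name, much like sp_MSforeachtable:
-- Count the rows in every user table
EXEC dbo.usp_ForEachTable @SQL = N'SELECT ''?'' AS TableName, COUNT(*) AS RowCnt FROM ?;';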
I need to TRIM databases as per requirement, so I'm using the script below and supplying the database names manually. All I need is to automate the script so that it gets the database names automatically. Can anyone please suggest how to do that?
Use [Sales_backup_2015_05_31_230001_7137975]
Exec [spMaint_TrimTestDB] 1
Go
For example:
Instead of supplying Sales_backup_2015_05_31_230001_7137975 manually, I need to get the database name automatically.
Thanks.
There is a function, DB_NAME(), that returns the name of the current database when no parameter is passed.
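For instance, a trivial illustration:
-- Returns the name of the database the query is currently running in
SELECT DB_NAME() AS CurrentDatabase;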
I guess dynamic SQL might help you run the SP in different databases:
DECLARE @sql nvarchar(max)
SELECT @sql = (
    SELECT N'Use ' + QUOTENAME([name]) + ' Exec [spMaint_TrimTestDB] 1;'
    FROM sys.databases
    WHERE database_id >= 5 AND [name] like 'Sales_backup%'
    FOR XML PATH('')
)
EXEC sp_executesql @sql
This script will build and execute a dynamic statement like:
Use [sales_backup_2015] Exec [spMaint_TrimTestDB] 1;
Use [sales_backup_2016] Exec [spMaint_TrimTestDB] 1;
etc...
I am using a FileTable in SQL Server 2014 and need to run an executable that parses the file name of any inserted/updated/deleted file; the executable then inserts the information parsed from the name into other tables in the database. I do not expect the .exe to run long at all, but if it runs into issues, I do not want to lock the table for an extended period of time.
For instance:
CREATE PROCEDURE filename_parser
    @name nvarchar(255)
AS
BEGIN
    DECLARE @exe nvarchar(255)
    SET @exe = 'c:\test\my.exe "' + @name + '"'
    EXEC master..xp_cmdshell @exe
END
GO
If I run the stored procedure from an INSERT or UPDATE trigger, for instance:
USE [db_1]
GO
CREATE TRIGGER [dbo].[i_table_a]
ON
[dbo].[table_a]
AFTER
INSERT
AS
DECLARE @file nvarchar(255)
SELECT TOP 1
    @file = name
FROM
    inserted
EXEC filename_parser @name = @file
will I end up locking table_a until the executable completes? Sorry if the answer is obvious; I have not found a straightforward answer. Any help or pointers in the appropriate direction are appreciated.
Related links:
Do stored procedures lock tables/rows?
SQL Server - How to lock a table until a stored procedure finishes
Microsoft docs say xp_cmdshell runs synchronously, and triggers run synchronously too. So if your exe gets stuck, it will hang the trigger, which will hang the INSERT and hold its locks. msdn.microsoft.com/en-us/library/ms175046.aspx#remarks
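One way to avoid that risk is to keep the external call out of the trigger entirely. Below is a minimal sketch, assuming a hypothetical queue table dbo.filename_queue and a SQL Agent job (or other scheduled process) that drains it; neither exists in the original setup.
-- Hypothetical queue table; the trigger writes here instead of calling xp_cmdshell
CREATE TABLE dbo.filename_queue
(
    id int IDENTITY(1,1) PRIMARY KEY,
    name nvarchar(255) NOT NULL,
    processed_at datetime2 NULL
);
GO
ALTER TRIGGER [dbo].[i_table_a] ON [dbo].[table_a]
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    -- Handles multi-row inserts and returns immediately; no external process runs here
    INSERT INTO dbo.filename_queue (name)
    SELECT name FROM inserted;
END;
GO
The scheduled job can then read the unprocessed rows, call filename_parser for each one, and mark them processed, all outside the trigger's transaction.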
I'm trying to create a dynamic database creation script.
There are a lot of steps and we create this database often so the script looks something like this.
DECLARE @databaseName nvarchar(100) = 'DatabaseName'
EXEC('/*A lot of database creation code built off of @databaseName*/')
This is all well and good except for one view that we'd like to create in @databaseName.
I've tried four different ways to create this view without success:
My first thought was to simply set the database context and then create the view in one script. Unfortunately, this didn't work because CREATE VIEW must be the first statement in its batch (details).
--Result: Error message, "'CREATE VIEW' must be the first statement in a query batch"
EXEC
('
USE [' + @databaseName + ']
CREATE VIEW
')
To get around (1), I tried to set the context separately so that CREATE VIEW would be the first command in its EXEC. This did create the view, but it did so within my current context and not @databaseName. It seems that the effects of calling USE in an EXEC only persist until the end of that EXEC statement (details).
--Result: The view is created in the currently active database rather than @databaseName
EXEC ('USE [' + @databaseName + ']')
EXEC ('CREATE VIEW')
Next I tried putting everything back into one script but included a GO command in order to make CREATE VIEW the first command in a new batch. This failed because GO is a batch separator recognized by client tools, not a T-SQL statement, so it isn't allowed within an EXEC string (details).
--Result: Error message, "Incorrect syntax near 'GO'"
EXEC
('
USE [' + @databaseName + ']
GO
CREATE VIEW
')
Finally I tried to specify the target database as part of the CREATE VIEW command. In this case the script failed because CREATE VIEW doesn't allow the database to be specified as part of its creation (details).
--Result: Error message, "'CREATE/ALTER VIEW' does not allow specifying the database name as a prefix to the object name"
EXEC ('CREATE VIEW [' + @databaseName + '].[dbo].[ViewName]')
Any suggestions? I think this should be a common use case but Google wasn't able to help me.
You can do this by double-nesting the dynamic SQL statements:
begin tran
declare @sql nvarchar(max) =
N'use [AdventureWorks2012];
exec (''create view Test as select * from sys.databases'')';
exec (@sql);
select * from AdventureWorks2012.sys.views
where name = 'Test'
rollback tran
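Applied to the question's @databaseName variable, a sketch of the same double-nesting (the view body here is only a placeholder) might look like:
DECLARE @databaseName nvarchar(100) = 'DatabaseName';
DECLARE @sql nvarchar(max) =
    N'USE ' + QUOTENAME(@databaseName) + N';
    EXEC (''CREATE VIEW dbo.ViewName AS SELECT * FROM sys.databases'');';
-- The outer EXEC switches context; the inner EXEC starts a new batch there,
-- so CREATE VIEW is its first statement and the view lands in @databaseName.
EXEC (@sql);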
Instead of double-nesting, another approach is to create a stored procedure whose only purpose is to execute dynamic SQL:
CREATE PROCEDURE [dbo].[util_CreateViewWithDynamicSQL]
    @sql nvarchar(max)
AS
BEGIN
    SET NOCOUNT ON;
    EXECUTE (@sql)
END
The stored procedure above can be re-used. Any time you need to create a view, just call the stored procedure and pass it the dynamic SQL. Create the procedure in the target database (or call it with a three-part name such as [DatabaseName].dbo.util_CreateViewWithDynamicSQL) so that the dynamic SQL, and therefore the view, runs in that database's context.
EXECUTE util_CreateViewWithDynamicSQL 'create view Test as select * from sys.databases'
I prefer this approach because dynamic SQL is confusing enough, and adding double-nesting complicates it further.
I have two different SQL Server databases (on the same server - if it helps) that need to share the same stored procedure logic. The solution I'm trying to achieve looks like this:
Database1
Table: TestTable
Synonym: sp_MyProc pointing at SharedDatabase.dbo.sp_MyProc
Database2
Table: TestTable
Synonym: sp_MyProc pointing at SharedDatabase.dbo.sp_MyProc
SharedDatabase
Proc: sp_MyProc which runs queries against TestTable
My hope was to use the synonyms so that if I execute sp_MyProc while in the context of Database1, it would use Database1.TestTable, and if I execute sp_MyProc while in the context of Database2, it would go against Database2.TestTable. However, when I execute sp_MyProc through either of the synonyms, it ignores the context of the synonym and looks for a local copy of TestTable, which is not found.
Is there a way to implement a shared stored procedure that executes against different copies of tables in different databases, either through synonyms or some other mechanism?
Edit
I should mention that in my case I am looking to do this with a large set of existing tables and procs, so any solution that requires modifying the procs or tables themselves is not ideal.
Something like this would work for the definition of the procedure. Be sure to guard against SQL injection since this is built dynamically.
CREATE PROCEDURE [dbo].dosomething
    @databaseName sysname,
    @schema sysname,
    @tableName sysname
as
declare @cmd as nvarchar(max)
-- include the database name so the query runs against the requested database
set @cmd = N'select * from ' + quotename(@databaseName) + N'.' + quotename(@schema) + N'.' + quotename(@tableName)
exec sp_executesql @cmd
Then use it like this:
dosomething 'SampleDb', 'dbo', 'sampleTable'
If the stored proc is in SharedDatabase, then it will always run in the context of SharedDatabase. To accomplish what you are trying to do and centralize the code, I would pass in a parameter to designate which database the call is coming from, so you can then execute the query against that specific TestTable. Basically, you will need to refer to each table using its fully qualified name, i.e. Database1.dbo.TestTable:
USE SharedDatabase
GO
CREATE PROCEDURE [dbo].sp_MyProc
    @dbsource varchar(50)
as
if (@dbsource = 'DB1')
begin
    select * from Database1.dbo.TestTable
end
else
begin
    select * from Database2.dbo.TestTable
end
GO
The other alternative is to create a view in SharedDatabase, say TestTableComposite, with an extra column identifying which database each row comes from. Then pass that value in as the parameter, and your SP on SharedDatabase will always stay in the context of that DB, as sketched below.
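A minimal sketch of that idea (the column name SourceDb and the procedure name sp_MyProcComposite are assumptions for illustration):
USE SharedDatabase
GO
CREATE VIEW dbo.TestTableComposite
AS
    -- Tag each row with the database it came from
    SELECT 'DB1' AS SourceDb, * FROM Database1.dbo.TestTable
    UNION ALL
    SELECT 'DB2' AS SourceDb, * FROM Database2.dbo.TestTable;
GO
CREATE PROCEDURE dbo.sp_MyProcComposite
    @dbsource varchar(50)
AS
    -- The proc stays in SharedDatabase; the parameter selects the source rows
    SELECT * FROM dbo.TestTableComposite WHERE SourceDb = @dbsource;
GO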