Change database context through a variable

How can I use a variable as the db context?
Create Procedure [dbo].[prName] (@dbname varchar(25)) as
begin
use master
-- some sql; I need to use master for some functions stored in master
use @dbname
exec('SELECT column_name FROM INFORMATION_SCHEMA.COLUMNS WHERE [TABLE_CATALOG] = ''' + @dbname + ''' and TABLE_NAME = ''table123''')
end
GO
Thanks

I think you might want to use the following:
CREATE Procedure [dbo].[prName] (@dbname varchar(25)) as
begin
exec('select top 5 * from ' + @dbname + '.yourSchema.yourTable')
end
The USE statement is not allowed in a stored procedure. If you are passing in the database name, you do not need USE; just include the database name in your SQL query.
Edit: Based on your edit saying you need to access items in master: just execute your SQL with fully qualified names, referencing master explicitly where needed.
CREATE Procedure [dbo].[prName] (@dbname varchar(25)) as
begin
select * from master.INFORMATION_SCHEMA.TABLES
exec('select top 5 * from ' + @dbname + '.yourSchema.yourTable')
end
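Since @dbname is concatenated straight into the dynamic SQL, it is worth validating it first. A minimal sketch of a safer variant (yourSchema.yourTable is a placeholder carried over from the answer, not a real object):
CREATE Procedure [dbo].[prName] (@dbname sysname) as
begin
    -- Reject anything that is not an existing database name
    IF DB_ID(@dbname) IS NULL
    BEGIN
        RAISERROR('Unknown database: %s', 16, 1, @dbname);
        RETURN;
    END;
    -- QUOTENAME brackets the name, so injected text cannot break out of it
    DECLARE @sql nvarchar(max) =
        N'select top 5 * from ' + QUOTENAME(@dbname) + N'.yourSchema.yourTable';
    EXEC sp_executesql @sql;
end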

Related

Use trigger to modify column data type length

My program creates a temp table which is dropped after the program has executed. The column's data type length is 8, but I want to change it to 15 when I run the program, using a trigger in SQL Server. I have a few tables whose length needs to change. Is there any way to change the length without stating the table name in the trigger?
Clarification:
I have 100 programs which create temporary tables with different names. Each temp table has user_id varchar(8), and I want to change the length to 15, but I don't want to open each program's source code to change it. Is there a better way you can suggest?
What you want is essentially possible to achieve using DDL triggers.
CREATE TRIGGER [TRG_TABLES]
ON DATABASE
AFTER CREATE_TABLE
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @TABLE_NAME sysname;
    SELECT @TABLE_NAME = EVENTDATA().value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(128)');
    IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.COLUMNS
               WHERE TABLE_NAME = @TABLE_NAME
                 AND COLUMN_NAME = 'TEST')
    BEGIN
        DECLARE @SQL nvarchar(max) = 'ALTER TABLE ' + QUOTENAME(@TABLE_NAME) + ' ALTER COLUMN TEST NVARCHAR(200)';
        EXEC sp_executesql @SQL;
    END
END
GO
ENABLE TRIGGER [TRG_TABLES] ON DATABASE
You should be EXTRA careful about SQL injection if you use this approach.
EDIT: This is just a general idea; you should figure out under which conditions to alter the column, e.g. whether there is a predictable pattern to your table names.
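For instance, assuming the per-run tables share a naming prefix such as TMP_ (that prefix is an assumption, not something from the question), the IF inside the trigger above could guard on both the name pattern and the current column definition. Note that this only helps for regular tables: real #temp tables are created in tempdb, and DDL triggers do not fire for temporary tables.
-- Hypothetical guard for the trigger body: only touch tables matching the pattern
IF @TABLE_NAME LIKE 'TMP[_]%'
   AND EXISTS (SELECT * FROM INFORMATION_SCHEMA.COLUMNS
               WHERE TABLE_NAME = @TABLE_NAME
                 AND COLUMN_NAME = 'user_id'
                 AND CHARACTER_MAXIMUM_LENGTH = 8)
BEGIN
    DECLARE @SQL nvarchar(max) =
        N'ALTER TABLE ' + QUOTENAME(@TABLE_NAME) + N' ALTER COLUMN user_id varchar(15)';
    EXEC sp_executesql @SQL;
END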

Moving table data to another table if the table does not exist

I am generating a dynamic script to move data into another database; if the table is not present, I want to create it. The script runs perfectly if executed directly, but it gives an error if the script string is run via the EXECUTE statement. I have tried EXEC as well.
declare @temp as varchar(max)
set @temp = 'select * into Allocation_Archive.dbo.Users from Users'
execute @temp
Error
Database 'select * into Allocation_Archive' does not exist. Make sure that the name is entered correctly.
Option 1
Add a USE database statement to your variable, or specify the full object name including database and schema.
Wrap the @temp variable in parentheses.
For example:
declare @temp as varchar(max)
set @temp = 'select * into Allocation_Archive.dbo.Users from ThisDatabase.dbo.Users'
execute (@temp)
Option 2
Don't use:
EXECUTE @temp
Instead use:
EXEC sp_executesql @temp
You will also need to change your @temp variable to nvarchar(max).
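Putting option 2 together, a minimal end-to-end sketch (the OBJECT_ID existence check is an addition to match the "create if not present" goal; the table names come from the question):
DECLARE @temp nvarchar(max);

IF OBJECT_ID('Allocation_Archive.dbo.Users') IS NULL
    -- Target table missing: SELECT ... INTO creates it
    SET @temp = N'select * into Allocation_Archive.dbo.Users from dbo.Users';
ELSE
    -- Target table already exists: append rows instead
    SET @temp = N'insert into Allocation_Archive.dbo.Users select * from dbo.Users';

EXEC sp_executesql @temp;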

SQL Server : update records in dynamically generated tables using parameters in stored procedure

I have to create a stored procedure to which I will pass tableName, columnName, and id as parameters. The task is to select records from the passed table where columnName has the passed id. If records are found, update them with some fixed data. Also implement a transaction so that we can roll back in case of any error.
There are hundreds of tables in the database and each table has a different schema, which is why I have to pass columnName.
I don't know what the best approach for this is. I am trying to select records into a temp table so that I can manipulate them as per the requirement, but it's not working.
I am using this code:
ALTER PROCEDURE [dbo].[GetRecordsFromTable]
    @tblName nvarchar(128),
    @keyCol varchar(100),
    @key int = 0
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        --DROP TABLE #TempTable;
        DECLARE @sqlQuery nvarchar(4000);
        SET @sqlQuery = 'SELECT * FROM ' + @tblName + ' WHERE ' + @keyCol + ' = 2';
        PRINT @sqlQuery;
        INSERT INTO #TempTable
        EXEC sp_executesql @sqlQuery,
             N'@keyCol varchar(100), @key int', @keyCol, @key;
        SELECT * FROM #TempTable;
    END TRY
    BEGIN CATCH
        EXECUTE [dbo].[uspPrintError];
    END CATCH;
END
I get an error
Invalid object name '#TempTable'
Also not sure if this is the best approach to get data and then update it.
If you absolutely must make that work, then I think you'll have to use a global temp table. You'll need to check whether it exists before running your dynamic SQL, and clean up afterwards. With a fixed table name you'll run into problems with other connections. Inside the dynamic SQL you'd add select * into ##temptable from .... Actually, I'm not even sure why you want the temp table in the first place; can't the dynamic SQL just return the results?
On the surface it seems like a solid idea to have one generic procedure for returning data with a couple of parameters to drive it but, without a lot of explanation, it's just not the way databases are designed to work.
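A minimal sketch of that global-temp-table idea, reusing the question's parameters (and assuming @tblName is a bare, single-part table name so QUOTENAME applies cleanly):
DECLARE @sqlQuery nvarchar(max);

-- Drop any leftover copy from a previous run
IF OBJECT_ID('tempdb..##TempTable') IS NOT NULL
    DROP TABLE ##TempTable;

-- SELECT ... INTO a global temp table survives the sp_executesql scope,
-- unlike a local #temp table created inside the dynamic batch
SET @sqlQuery = N'SELECT * INTO ##TempTable FROM ' + QUOTENAME(@tblName)
              + N' WHERE ' + QUOTENAME(@keyCol) + N' = @key';
EXEC sp_executesql @sqlQuery, N'@key int', @key;

SELECT * FROM ##TempTable;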
You should create the temp table first:
IF OBJECT_ID('tempdb..##TempTable') IS NOT NULL
    DROP TABLE ##TempTable
CREATE TABLE ##TempTable (/* column definitions matching the dynamic SELECT */)

How to check the existence of a table in a different SQL db?

I have db A and db B. At the beginning of a stored procedure I want to back up all rows from B.mytable to B.mytablebackup. The rest of the stored procedure runs against tables on db A (which gathers data and writes it to B.mytable).
So I check to see if B.mytablebackup exists
IF EXISTS(SELECT 1 FROM B.dbo.mytablebackup)
and if it does, the stored procedure does an
INSERT INTO B..mytablebackup SELECT * FROM B..mytable
If it doesn't exist it does a
SELECT * INTO B..mytablebackup from B..mytable
But when I execute the stored procedure I get the error
There is already an object named 'mytablebackup' in the database
I added a Print statement and execution is taking the "does not exist" branch of the IF.
What am I doing wrong?
For SQL Server, you should use the system view sys.tables to check whether the table exists. Your current check tests whether mytablebackup contains any rows, not whether it exists, so an empty backup table sends execution down the "does not exist" branch and the SELECT INTO then fails because the table is already there.
IF EXISTS(SELECT 1 FROM B.sys.tables WHERE name = 'mytablebackup')
OBJECT_ID can be used too:
IF OBJECT_ID('B.dbo.mytablebackup') IS NOT NULL
You can check directly from the given DB, SCHEMA, and TABLE parameters (useful when the database, schema, and table names are dynamic):
DECLARE @targetdatabase NVARCHAR(MAX),
        @SchemaName NVARCHAR(MAX),
        @TableName NVARCHAR(MAX)
DECLARE @TempTableName NVARCHAR(MAX) = QUOTENAME(@targetdatabase) + '.' +
        QUOTENAME(@SchemaName) + '.' + QUOTENAME(@TableName)
IF OBJECT_ID(@TempTableName) IS NULL
BEGIN
    PRINT @TempTableName
END
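Applied to the original procedure, the branch could then look like this (a sketch using the question's table names):
IF OBJECT_ID('B.dbo.mytablebackup') IS NOT NULL
    -- The backup table exists (even if empty): append the current rows
    INSERT INTO B.dbo.mytablebackup SELECT * FROM B.dbo.mytable;
ELSE
    -- First run: let SELECT ... INTO create the backup table
    SELECT * INTO B.dbo.mytablebackup FROM B.dbo.mytable;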

Sharing a stored procedure across databases using synonyms

I have two different SQL Server databases (on the same server - if it helps) that need to share the same stored procedure logic. The solution I'm trying to achieve looks like this:
Database1
Table: TestTable
Synonym: sp_MyProc pointing at SharedDatabase.dbo.sp_MyProc
Database2
Table: TestTable
Synonym: sp_MyProc pointing at SharedDatabase.dbo.sp_MyProc
SharedDatabase
Proc: sp_MyProc which runs queries against TestTable
My hope was to use the synonyms so that if I execute sp_MyProc while in the context of Database1, it would use Database1.TestTable, and if I execute sp_MyProc while in the context of Database2, it would go against Database2.TestTable. However, when I execute sp_MyProc through either of the synonyms, it ignores the context of the synonym and looks for a local copy of TestTable, which is not found.
Is there a way to implement a shared stored procedure that executes against different copies of tables in different databases, either through synonyms or some other mechanism?
Edit
I should mention that in my case I am looking to do this with a large set of existing tables and procs, so any solution that requires modifying the procs or tables themselves are not ideal.
Something like this would work for the definition of the procedure. Be sure to guard against SQL injection since this is built dynamically.
CREATE PROCEDURE [dbo].dosomething
    @databaseName sysname,
    @schema sysname,
    @tableName sysname
AS
    DECLARE @cmd AS nvarchar(max)
    SET @cmd = N'select * from ' + QUOTENAME(@databaseName) + N'.'
             + QUOTENAME(@schema) + N'.' + QUOTENAME(@tableName)
    EXEC sp_executesql @cmd
Then use it like this:
EXEC dosomething 'SampleDb', 'dbo', 'sampleTable'
If the stored proc is in the SharedDatabase, then it will always run in the context of SharedDatabase. To accomplish what you are trying to do and centralize the code, I would pass in a parameter designating which database the call is coming from, so you can execute the query against that specific TestTable. Basically, you will need to refer to each table by its fully qualified name, i.e. Database1.dbo.TestTable.
USE SharedDatabase
GO
CREATE PROCEDURE [dbo].sp_MyProc
    @dbsource varchar(50)
AS
    IF (@dbsource = 'DB1')
    BEGIN
        SELECT * FROM Database1.dbo.TestTable
    END
    ELSE
    BEGIN
        SELECT * FROM Database2.dbo.TestTable
    END
GO
The other alternative is to create a view in SharedDatabase, say TestTableComposite, with an extra column identifying where the source data lives. Then pass that identifier in as the parameter, and your SP on SharedDatabase will always run in the context of that DB.
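A minimal sketch of that view (assuming both copies of TestTable share the same column layout; the 'DB1'/'DB2' tags are placeholders):
USE SharedDatabase
GO
CREATE VIEW dbo.TestTableComposite
AS
    -- Tag each row with the database it came from
    SELECT 'DB1' AS dbsource, t.* FROM Database1.dbo.TestTable AS t
    UNION ALL
    SELECT 'DB2' AS dbsource, t.* FROM Database2.dbo.TestTable AS t
GO
The shared procedure can then filter on dbsource instead of branching between databases.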
