I need to have some users that are systematically put into TRANSACTION ISOLATION LEVEL SNAPSHOT when they log in to the database.
I tried to use a LOGON trigger to set the user's session to that isolation level when ORIGINAL_LOGIN() was the user I wanted to promote, but this does not work.
I also tried to see whether this could be configured in the ODBC connection string, but I found nothing.
This is my current test code.
First part, the source database:
CREATE DATABASE DB_SOURCE;
GO
ALTER DATABASE DB_SOURCE
SET ALLOW_SNAPSHOT_ISOLATION ON;
GO
USE DB_SOURCE;
GO
CREATE TABLE T (C INT);
GO
INSERT INTO T VALUES (1), (2), (3);
GO
Second part, the login and the user:
USE master;
GO
CREATE LOGIN CNX_USER_TEST
WITH PASSWORD = 'foo',
DEFAULT_DATABASE = DB_SOURCE;
GO
USE DB_SOURCE;
GO
CREATE USER USR_USER_TEST
FROM LOGIN CNX_USER_TEST;
GO
GRANT SELECT TO USR_USER_TEST;
GO
Third part, the logon trigger:
USE master;
GO
CREATE TRIGGER E_LOGON
ON ALL SERVER WITH EXECUTE AS N'sa'
FOR LOGON
AS
BEGIN
IF ORIGINAL_LOGIN()= N'CNX_USER_TEST'
BEGIN
COMMIT;
SET TRANSACTION ISOLATION LEVEL SNAPSHOT;
END;
END;
Fourth part, the test.
In SSMS window 1:
BEGIN TRAN;
UPDATE T SET C = C + 1;
In SSMS, in another window connected as CNX_USER_TEST:
SELECT * FROM T;
Blocked!
I have the following issue:
I have three databases: db1, db2, and db3. I have an application that loads data into db2 or db3. db1 has a few tables that the application uses to determine behavior, including which database the application should load data into.
Users need write access to db1 to operate the application (there is a console application that writes to tables in db1, using Windows authentication).
Users should not have DML privileges on db2 and db3, with the exception of a few predetermined operations. We grant AD groups database roles to control access from an organization perspective. Specifically, I'm trying to build a stored procedure in db1 that operators can use to reverse data loaded into db2 or db3, with appropriate logging.
I'm attempting to use create proc ... execute as owner to accomplish this, but it does not seem to be working when I try to hit tables in db2/db3 (I'm thinking that "execute as owner" operates on database-level users and not server-level logins?). The following causes a permission error stating that the owner (myself) does not have permissions on db2/db3.
use db1
go
create proc dbo.wrapper @recordid int
as begin
/*
capturing user
*/
declare @usr varchar(255) = SUSER_SNAME()
exec dbo.[inner] @usr, @recordid
end
use db1
go
create proc dbo.[inner] @usr varchar(255), @recordid int
with execute as owner
as begin
/*
logic to determine whether to update db2 or db3 goes here
*/
insert db2.dbo.rolled_back
select *, @usr from db2.dbo.transactions where id = @recordid
delete from db2.dbo.transactions where id = @recordid
insert db3.dbo.rolled_back
select *, @usr from db3.dbo.transactions where id = @recordid
delete from db3.dbo.transactions where id = @recordid
end
Is there a way to get this to work? I've heard that certificate signing could do this; does anyone have any experience using certificate users? Our DBAs would rather not have to maintain certificates, so if there is a way to get this to work without certificates, that would be best.
Any advice would be helpful.
Thank you!
I'm going to cover the cross-database ownership chaining side of things here. Note that there are certainly security considerations when using this method. For example, someone with permission to create objects in one database can give themselves access to data in another database with the same owner, even when they themselves have no access to that other database. The security concerns, however, are out of scope for this answer.
Firstly, let's create a couple of test databases.
USE master;
GO
CREATE DATABASE Chain1;
CREATE DATABASE Chain2;
Now I'm going to create a LOGIN, which is disabled, and make it the owner of these two databases. The databases having the same owner is important; otherwise the chaining won't work.
CREATE LOGIN ChainerOwner WITH PASSWORD = N'SomeSecurePassword123';
ALTER LOGIN ChainerOwner DISABLE;
GO
ALTER AUTHORIZATION ON DATABASE::Chain1 TO ChainerOwner;
ALTER AUTHORIZATION ON DATABASE::Chain2 TO ChainerOwner;
I'm also going to create a LOGIN which we're going to use to test on:
CREATE LOGIN SomeUser WITH PASSWORD = N'SomeSecurePassword123';
Great, now I can create a few objects: a table in Chain1, a PROCEDURE in Chain2 that accesses the TABLE, and a USER in both databases for SomeUser. In Chain1 the USER will be given no permissions, and in Chain2 the USER will be given permission to EXECUTE the PROCEDURE:
USE Chain1;
GO
CREATE TABLE dbo.SomeTable (I int IDENTITY,
S varchar(10));
INSERT INTO dbo.SomeTable (S)
VALUES ('abc'),
('xyz');
GO
CREATE USER SomeUser FOR LOGIN SomeUser;
GO
USE Chain2;
GO
CREATE PROC dbo.CrossDBProc @I int AS
BEGIN
SELECT I,
S
FROM Chain1.dbo.SomeTable
WHERE I = @I;
END;
GO
CREATE USER SomeUser FOR LOGIN SomeUser;
GO
GRANT EXECUTE ON dbo.CrossDBProc TO SomeUser;
GO
Great, all the objects are created; now let's try to EXECUTE that PROCEDURE:
EXECUTE AS LOGIN = 'SomeUser';
GO
EXEC dbo.CrossDBProc 1; --This fails
GO
REVERT;
GO
This fails, with a permission error:
The SELECT permission was denied on the object 'SomeTable', database 'Chain1', schema 'dbo'.
This is expected, as cross-database ownership chaining is not enabled. Let's, therefore, enable it now.
USE master;
GO
ALTER DATABASE Chain1 SET DB_CHAINING ON;
ALTER DATABASE Chain2 SET DB_CHAINING ON;
Now, if I try the same thing again, the SQL works:
USE Chain2;
GO
EXECUTE AS LOGIN = 'SomeUser';
GO
EXEC dbo.CrossDBProc 1; --This now works
GO
REVERT;
GO
This successfully returns the result set:
I    S
1    abc
So yes, you can chain across databases, but it requires some setup, and (again) there are security considerations you need to think about.
Clean up:
USE master;
GO
DROP DATABASE Chain1;
DROP DATABASE Chain2;
GO
DROP LOGIN ChainerOwner;
DROP LOGIN SomeUser;
I have written the T-SQL code below. I want to put it in a SQL Server TRY...CATCH block. However, because I must execute some statements before proceeding with other statements, I am using the GO keyword, and this makes the code crash without executing the code in the CATCH block. It just crashes as if there were no CATCH block. If I remove the GOs from the code and the code crashes, execution jumps to the CATCH block, which is the desired behavior.
Any ideas on what I can do?
BEGIN TRY
RESTORE FILELISTONLY
FROM DISK = 'D:\Folder1\Database1.bak'
GO
ALTER DATABASE BusinessData
SET SINGLE_USER WITH
ROLLBACK IMMEDIATE
ALTER DATABASE BusinessData
SET RECOVERY Simple
RESTORE DATABASE BusinessData
FROM DISK = 'D:\Folder1\Database1.bak'
WITH MOVE 'BusinessData' TO 'C:\MyDATA\BusinessData.mdf',
MOVE 'BusinessData_log' TO 'C:\MyDATA\BusinessData_log.ldf'
ALTER DATABASE BusinessData SET MULTI_USER
GO
IF NOT EXISTS (SELECT * FROM sys.server_principals WHERE name = N'SERVER1\user1')
CREATE LOGIN [SERVER1\user1] FROM WINDOWS WITH DEFAULT_DATABASE=[master], DEFAULT_LANGUAGE=[us_english]
GO
USE [ProjectServer_Authentication]
GO
IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = N'SERVER1\user1')
CREATE USER [SERVER1\user1] FOR LOGIN [SERVER1\user1] WITH DEFAULT_SCHEMA=[dbo]
GO
EXEC sp_addrolemember 'db_owner',N'SERVER1\user1'
GO
USE [BusinessData]
IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = N'SERVER1\user1')
CREATE USER [SERVER1\user1] FOR LOGIN [SERVER1\user1] WITH DEFAULT_SCHEMA=[dbo]
GO
EXEC sp_addrolemember 'db_owner',N'SERVER1\user1'
GO
END TRY
BEGIN CATCH
USE msdb
GO
EXEC sp_send_dbmail @profile_name='My Mail Profile',
@recipients='myemailaccount@mydomain.org',
@subject='Refresh Error',
@body='Email body'
END CATCH
I think GO is the problem, as:
GO is not a Transact-SQL statement; it is a command recognized by the sqlcmd and osql utilities and the SQL Server Management Studio code editor.
SQL Server utilities interpret GO as a signal that they should send the current batch of Transact-SQL statements to an instance of SQL Server. The current batch of statements is composed of all statements entered since the last GO, or since the start of the ad hoc session or script if this is the first GO.
With every GO you start a new batch, which means your BEGIN TRY and your END CATCH end up in different batches, and therefore the construct does not work.
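As a rough sketch of one way around this (assuming none of the statements in the script actually has to be the first statement of its own batch, which holds for the statements shown): remove the GO separators so the whole script runs as a single batch inside the TRY, keep the USE statements (they are ordinary statements; only GO splits batches), and call msdb.dbo.sp_send_dbmail with a three-part name in the CATCH block. Be aware that compile-time errors and batch-aborting errors will still bypass the CATCH block.
BEGIN TRY
    -- Single batch: no GO anywhere, so the TRY...CATCH spans the whole script.
    RESTORE FILELISTONLY
    FROM DISK = 'D:\Folder1\Database1.bak';

    ALTER DATABASE BusinessData SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
    ALTER DATABASE BusinessData SET RECOVERY SIMPLE;

    RESTORE DATABASE BusinessData
    FROM DISK = 'D:\Folder1\Database1.bak'
    WITH MOVE 'BusinessData' TO 'C:\MyDATA\BusinessData.mdf',
         MOVE 'BusinessData_log' TO 'C:\MyDATA\BusinessData_log.ldf';

    ALTER DATABASE BusinessData SET MULTI_USER;

    IF NOT EXISTS (SELECT * FROM sys.server_principals WHERE name = N'SERVER1\user1')
        CREATE LOGIN [SERVER1\user1] FROM WINDOWS
        WITH DEFAULT_DATABASE = [master], DEFAULT_LANGUAGE = [us_english];

    -- USE switches database context mid-batch; it is only GO that starts a new batch.
    USE [ProjectServer_Authentication];
    IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = N'SERVER1\user1')
        CREATE USER [SERVER1\user1] FOR LOGIN [SERVER1\user1] WITH DEFAULT_SCHEMA = [dbo];
    EXEC sp_addrolemember 'db_owner', N'SERVER1\user1';

    USE [BusinessData];
    IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = N'SERVER1\user1')
        CREATE USER [SERVER1\user1] FOR LOGIN [SERVER1\user1] WITH DEFAULT_SCHEMA = [dbo];
    EXEC sp_addrolemember 'db_owner', N'SERVER1\user1';
END TRY
BEGIN CATCH
    -- Three-part name avoids needing USE msdb inside the CATCH block.
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'My Mail Profile',
        @recipients = 'myemailaccount@mydomain.org',
        @subject = 'Refresh Error',
        @body = 'Email body';
END CATCH;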
I have a user who has the db_datareader and db_datawriter roles on a database.
I want to set the isolation level on the database the user has access to.
What permissions will my user need to be able to set this?
Database used: SQL Server 2008
This is not setting an isolation level:
ALTER DATABASE dbname SET ALLOW_SNAPSHOT_ISOLATION ON;
That is altering the database. For that you need to provide them with ALTER rights on the database:
GRANT ALTER ON DATABASE::dbname TO username;
Otherwise you get this error:
Msg 5011, Level 14, State 9, Line 1
User does not have permission to alter database 'dbname', the database does not exist, or the database is not in a state that allows access checks.
Msg 5069, Level 16, State 1, Line 1
ALTER DATABASE statement failed.
Now, ALTER is all or nothing: you can't use it to allow them to change the ALLOW_SNAPSHOT_ISOLATION setting but not other settings like forced parameterization, compatibility level, etc. You can do this much more granularly, though; perhaps you could create a stored procedure that does something like this:
CREATE PROCEDURE dbo.SetIsolationLevel
WITH EXECUTE AS OWNER
AS
BEGIN
SET NOCOUNT ON;
DECLARE @sql NVARCHAR(MAX) = N'ALTER DATABASE '
+ QUOTENAME(DB_NAME())
+ ' SET ALLOW_SNAPSHOT_ISOLATION ON;';
EXEC sp_executesql @sql;
END
GO
Now you just have to give the user EXEC permission on that procedure, which they can call instead of running the ALTER DATABASE command explicitly, and without being granted full ALTER DATABASE privileges:
GRANT EXEC ON dbo.SetIsolationLevel TO username;
GO
You can simulate them calling this stored procedure by logging in as them, or using the EXECUTE AS feature directly:
EXECUTE AS USER = 'username';
GO
EXEC dbo.SetIsolationLevel;
GO
REVERT;
GO
Another idea is to simply set this option on the model database; then any new databases that get created for your users will automatically inherit it, and you don't have to worry about making them turn it on.
ALTER DATABASE model SET ALLOW_SNAPSHOT_ISOLATION ON;
GO
CREATE DATABASE splunge;
GO
SELECT snapshot_isolation_state_desc FROM sys.databases WHERE name = N'splunge';
Result:
ON
During my integration tests, I try to drop the test database using:
USE master
ALTER DATABASE TestXyz SET SINGLE_USER WITH ROLLBACK IMMEDIATE
DROP DATABASE TestXyz
However, quite often (given the number of tests) one of the application's background processes manages to get in between SET SINGLE_USER and DROP DATABASE, which makes it the single user of the database and breaks the DROP.
I cannot use RESTRICTED_USER, as the application currently has db_owner permission (due to a large amount of legacy code, some of which requires it, so it will not be changed just for the tests).
I cannot use OFFLINE, as it does not delete the database files from disk.
How would you solve this problem?
OK, plan B: iteratively kill connections and rename the DB to get it away from the application's domain, then drop it. To handle iterating through connections, a TRY...CATCH around the rename will hopefully allow it to keep running until it is able to drop the connections. The example code below creates a database testdb, renames it to testdb2 in the WHILE loop, and drops it after the loop has succeeded.
-- Setup a scratch Db for testing
create database testdb
go
use testdb
while exists (select name from sys.databases where name = 'testdb')
Begin
DECLARE @DbName nvarchar(50) SET @DbName = N'testdb'
DECLARE @EXECSQL varchar(max) SET @EXECSQL = ''
SELECT @EXECSQL = @EXECSQL + 'Kill ' + Convert(varchar, SPId) + ';'
FROM MASTER..SysProcesses
WHERE DBId = DB_ID(@DbName) AND SPId <> @@SPID
EXEC(@EXECSQL)
Begin try
EXEC sp_renamedb 'testdb', 'testdb2'
end try
Begin Catch
print 'failed to rename'
End Catch
end
drop database testdb2
Try one of these:
Stop the application services and run your query.
Stop the application services, restart the SQL Server services, and then run your query.
I have finally solved it using the following approach:
ALTER LOGIN MyAppUser DISABLE
ALTER DATABASE TestXyz SET SINGLE_USER WITH ROLLBACK IMMEDIATE
DROP DATABASE TestXyz
ALTER LOGIN MyAppUser ENABLE
Since I can use a different login for the test database management process, this allows me to block the application from accessing the DB. (The reason for SINGLE_USER here is just to kick out already connected users. I haven't checked whether ALTER LOGIN ... DISABLE already does that, but I assume it does not.)
An alternative option is to drop MyAppUser from the database before dropping the database itself; however, I only thought of it now and do not have code for it.
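For completeness, a minimal sketch of that alternative (assuming MyAppUser is also the name of the database user mapped to the application's login, and that it does not own any schemas in the test database):
USE TestXyz;
-- Remove the application's user so its login can no longer open new connections to this database.
-- (This fails if the user owns schemas or objects; those would have to be transferred first.)
DROP USER MyAppUser;

USE master;
-- Kick out any sessions that were already connected, then drop the database.
ALTER DATABASE TestXyz SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
DROP DATABASE TestXyz;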
I have an INSERT trigger on a table that simply executes a job.
Example:
CREATE TABLE test
(
RunDate smalldatetime
)
CREATE TRIGGER StartJob ON test
AFTER INSERT
AS
EXEC msdb.dbo.sp_start_job 'TestJob'
When I insert a record into this table, the job is fired off without any issue. There are a few people, however, who have lower permissions than I do (db_datareader/db_datawriter on the database only); they are able to insert a record into the table, but the trigger does not fire.
I am a SQL Server novice and I was under the impression that users did not need elevated permissions to fire off a trigger (I thought that was one of the big benefits!). Is this a permission issue at the trigger level, or at the job level? What can I do to get around this limitation?
The trigger will execute in the context of the caller, which may or may not have the permissions to access msdb. That seems to be your problem. There are a few ways to extend these permissions using EXECUTE AS; they are covered in great detail in this link.
Use impersonation within trigger:
CREATE TRIGGER StartJob ON test
with execute as owner
AFTER INSERT
AS
EXEC msdb.dbo.sp_start_job 'TestJob'
And set the database to TRUSTWORTHY (or read about signing in the above link):
alter database TestDB set trustworthy on
Another way to go (depending on what operations the agent job performs) would be to leverage a Service Broker queue to handle the stored procedure activation. Your users' context would simply SEND on the queue, while, in an asynchronous process, Service Broker would activate a stored procedure that executes in the context of a more highly privileged user. I would opt for this solution rather than relying on a trigger calling an agent job.
I wanted to test the call to Service Broker, so I wrote this simple test example. Instead of calling an SSIS package I simply send an email, but it is very similar to your situation. Notice I use SET TRUSTWORTHY ON at the top of the script. Please read about the implications of this setting.
To run this sample you will need to substitute your email profile info below, <your_email_address_here>, etc.
use Master;
go
if exists(select * from sys.databases where name = 'TestDB')
drop database TestDB;
create database TestDB;
go
alter database TestDB set ENABLE_BROKER;
go
alter database TestDB set TRUSTWORTHY ON;
use TestDB;
go
------------------------------------------------------------------------------------
-- create procedure that will be called by svc broker
------------------------------------------------------------------------------------
create procedure dbo.usp_SSISCaller
as
set nocount on;
declare @dlgid uniqueidentifier;
begin try
-- * figure out how to start SSIS package from here
-- for now, just send an email to illustrate the async callback
;receive top(1)
@dlgid = conversation_handle
from SSISCallerQueue;
if @@rowcount = 0
begin
return;
end
end conversation @dlgid;
exec msdb.dbo.sp_send_dbmail
@profile_name = '<your_profile_here>',
@importance = 'NORMAL',
@sensitivity = 'NORMAL',
@recipients = '<your_email_address_here>',
@copy_recipients = '',
@blind_copy_recipients = '',
@subject = 'test from ssis caller',
@body = 'testing',
@body_format = 'TEXT';
return 0;
end try
begin catch
declare @msg varchar(max);
select @msg = error_message();
raiserror(@msg, 16, 1);
return -1;
end catch;
go
------------------------------------------------------------------------------------
-- setup svcbroker objects
------------------------------------------------------------------------------------
create contract [//SSISCallerContract]
([http://schemas.microsoft.com/SQL/ServiceBroker/DialogTimer] sent by initiator);
create queue SSISCallerQueue
with status = on,
activation (
procedure_name = usp_SSISCaller,
max_queue_readers = 1,
execute as 'dbo' );
create service [//SSISCallerService]
authorization dbo
on queue SSISCallerQueue ([//SSISCallerContract]);
go
return;
-- usage
/*
-- put a row into the queue to trigger the call to usp_SSISCaller
begin transaction;
declare @dlgId uniqueidentifier;
begin dialog conversation @dlgId
from service [//SSISCallerService]
to service '//SSISCallerService',
'CURRENT DATABASE'
on contract [//SSISCallerContract]
with encryption = off;
begin conversation timer (@dlgId)
TIMEOUT = 5; -- seconds
commit transaction;
*/
It would be permissions at the job level. You could possibly assign those users the SQLAgentReaderRole in msdb to be able to start a job, provided they are added to a group that owns the job. If they are not in a group which owns the job, it gets more difficult.
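For example, granting that role might look something like the sketch below (the AD group name is only a placeholder; per the above, the group would also need its own login and to own the job):
USE msdb;
GO
-- [DOMAIN\JobOperators] is a hypothetical AD group that already has a SQL Server login
-- and owns the agent job; its members can then work with that job.
CREATE USER [DOMAIN\JobOperators] FOR LOGIN [DOMAIN\JobOperators];
GO
EXEC sp_addrolemember N'SQLAgentReaderRole', N'DOMAIN\JobOperators';
GO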