Could this cursor be optimized or rewritten for optimum performance? - sql-server

There is a need to update all of our databases on our server and perform the same logic on each one. The databases in question all follow a common naming scheme like CorpDB1, CorpDB2, etc. Instead of creating a SQL Agent Job for each of the databases in question (over 50), I have thought about using a cursor to iterate over the list of databases and then run some dynamic SQL against each one. In light of the common notion that cursors should be a last resort, could this be rewritten for better performance, or written another way, perhaps with the use of the undocumented sp_MSforeachdb stored procedure?
DECLARE @db VARCHAR(100)   -- current database name
DECLARE @sql VARCHAR(1000) -- T-SQL used for processing on each database

DECLARE db_cursor CURSOR FAST_FORWARD FOR
    SELECT name
    FROM MASTER.dbo.sysdatabases
    WHERE name LIKE 'CorpDB%'

OPEN db_cursor
FETCH NEXT FROM db_cursor INTO @db

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = 'USE ' + @db +
               ' DELETE FROM db_table --more t-sql processing'
    EXEC(@sql)
    FETCH NEXT FROM db_cursor INTO @db
END

CLOSE db_cursor
DEALLOCATE db_cursor

Cursors are bad when they are used to tackle a set-based problem with procedural code. I don't think a cursor is necessarily a bad idea in your scenario.
When operations need to be run against multiple databases (backups, integrity checks, index maintenance, etc.), there's no issue with using a cursor. Sure, you could build a temp table that contains database names and loop through that...but it's still a procedural approach.
For your specific case, if you're not deleting rows in these tables based on some WHERE clause criteria, consider using TRUNCATE TABLE instead of DELETE FROM. The differences between the two operations are explained here. Note that the user running TRUNCATE TABLE will need ALTER permission on the affected objects.
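If the tables can indeed be emptied wholesale, the swap is simple; a minimal sketch, assuming the db_table placeholder from the question (the WHERE example column is made up):

-- DELETE logs each deleted row and honors a WHERE clause:
DELETE FROM db_table WHERE SomeColumn = 'some value';

-- TRUNCATE TABLE deallocates whole pages, resets any identity seed,
-- and requires ALTER permission on the table:
TRUNCATE TABLE db_table;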

This will collect the set of delete statements and run them all in a single sequence. This is not necessarily going to be better performance-wise but just another way to skin the cat.
DECLARE @sql NVARCHAR(MAX); -- if SQL Server 2000, use NVARCHAR(4000)
SET @sql = N'';

SELECT @sql = @sql + N';DELETE ' + name + '..db_table -- more t-sql'
FROM master.sys.databases
WHERE name LIKE N'CorpDB%';

SET @sql = STUFF(@sql, 1, 1, '');

EXEC sp_executesql @sql;
You may consider building the string in a similar way inside your cursor, instead of running EXEC() in the loop for each command. If you're going to continue using a cursor, use the following declaration:
DECLARE db_cursor CURSOR
LOCAL STATIC FORWARD_ONLY READ_ONLY
FOR
This will have the least locking and no unnecessary tempdb usage.
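As mentioned above, you could build one big batch inside the cursor and execute it once at the end; a rough sketch of that combination (the db_table name and dbo schema are carried over from the question's placeholder):

DECLARE @db  VARCHAR(100)
DECLARE @sql NVARCHAR(MAX) = N''

DECLARE db_cursor CURSOR
LOCAL STATIC FORWARD_ONLY READ_ONLY
FOR
    SELECT name
    FROM master.sys.databases
    WHERE name LIKE 'CorpDB%'

OPEN db_cursor
FETCH NEXT FROM db_cursor INTO @db

WHILE @@FETCH_STATUS = 0
BEGIN
    -- append one statement per database instead of running EXEC() each time
    SET @sql = @sql + N'DELETE FROM ' + QUOTENAME(@db) + N'.dbo.db_table;' + CHAR(13)
    FETCH NEXT FROM db_cursor INTO @db
END

CLOSE db_cursor
DEALLOCATE db_cursor

EXEC sp_executesql @sql  -- one round of execution for all databases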

Related

Adding the postfix "_DELETE" to all stored procedures identified as not being used

In attempting to clean up my database, I have managed to identify a list of stored procedures that aren't being used. I want to mark these for deletion, adding the post-fix "_DELETE" to all of these in one script. Can anyone advise me on how to go about this please? Thank you.
Try using a cursor for this purpose:
DECLARE @mockupTable TABLE(ID INT IDENTITY, SPName VARCHAR(100));
INSERT INTO @mockupTable VALUES
 ('old_proc_name1')
,('old_proc_name2')
,('old_proc_name3')

DECLARE @name VARCHAR(50) = 'deleted'
DECLARE @newName VARCHAR(50)

DECLARE db_cursor CURSOR FOR
    SELECT SPName FROM @mockupTable

OPEN db_cursor
FETCH NEXT FROM db_cursor INTO @name

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @newName = CONCAT(@name, '_DELETED')
    --PRINT @newName
    EXEC sp_rename @name, @newName
    FETCH NEXT FROM db_cursor INTO @name
END

CLOSE db_cursor
DEALLOCATE db_cursor
Assuming you already have the names for your procedures you can just substitute them into your cursor definition below:
DECLARE ProcCursor CURSOR LOCAL STATIC READ_ONLY FORWARD_ONLY
FOR
    SELECT Name = CONCAT(QUOTENAME(s.name), '.', QUOTENAME(p.name)),
           [NewName] = CONCAT(p.name, '_DELETE')
    FROM sys.procedures AS p
    INNER JOIN sys.schemas AS s
        ON s.schema_id = p.schema_id
    WHERE p.object_id IN
    (   OBJECT_ID('dbo.SomeProc', 'P'),
        OBJECT_ID('dbo.SomeProc2', 'P'),
        OBJECT_ID('dbo.SomeProc3', 'P')
    );

DECLARE @Name NVARCHAR(776), @NewName SYSNAME;

OPEN ProcCursor;
FETCH NEXT FROM ProcCursor INTO @Name, @NewName;

WHILE (@@FETCH_STATUS = 0)
BEGIN
    EXECUTE sp_rename @Name, @NewName, 'OBJECT';
    FETCH NEXT FROM ProcCursor INTO @Name, @NewName;
END

CLOSE ProcCursor;
DEALLOCATE ProcCursor;
It is worth noting that this is one of the very few scenarios where I would advocate using a cursor, but as above, when using a cursor you should always ensure you explicitly declare the simplest cursor possible (e.g. LOCAL STATIC READ_ONLY FORWARD_ONLY). By telling SQL Server that your cursor is static, only used locally, read-only, and traversed in one direction only, you stop it from working on the assumption that anything could happen with the cursor, and the cursor will be much faster. On a small scale like this it is unlikely to make a tangible difference, but on larger sets it can make a considerable difference.
For further reading see What impact can different cursor options have? - The conclusion there is actually that I should have used LOCAL FAST_FORWARD rather than the options I did use. I have left them in though, as the difference is negligible, and I find using all four easier to remember and clearer in displaying intent.
To re-iterate what has been stated in comments, it really is a good idea to use version control on your databases; that way you don't need to mark anything for deletion, you can just delete it and still retain the definition in your source control. If you can't use source control for whatever reason, DDL triggers can provide rudimentary change tracking.
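As a rough illustration of the DDL trigger idea (the log table and trigger names here are hypothetical, not from the post above):

-- assumes a logging table such as:
CREATE TABLE dbo.DDLChangeLog
(
    EventDate  DATETIME      NOT NULL DEFAULT GETDATE(),
    LoginName  SYSNAME       NOT NULL,
    EventType  NVARCHAR(100) NOT NULL,
    ObjectName NVARCHAR(256) NULL,
    EventXml   XML           NOT NULL
);
GO
CREATE TRIGGER trg_LogProcChanges
ON DATABASE
FOR CREATE_PROCEDURE, ALTER_PROCEDURE, DROP_PROCEDURE
AS
BEGIN
    DECLARE @evt XML = EVENTDATA();

    -- record who changed which procedure, and how
    INSERT INTO dbo.DDLChangeLog (LoginName, EventType, ObjectName, EventXml)
    VALUES
    (
        ORIGINAL_LOGIN(),
        @evt.value('(/EVENT_INSTANCE/EventType)[1]',  'NVARCHAR(100)'),
        @evt.value('(/EVENT_INSTANCE/ObjectName)[1]', 'NVARCHAR(256)'),
        @evt
    );
END;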

SQL server cursor slow performance

I'm getting started with my first use of a cursor in a stored procedure in SQL Server 2008. I've done some preliminary reading and I understand that they have significant performance limitations. In my current case I think they're necessary (I want to run multiple stored procedures for each stock symbol in a symbols table).
Edit:
The sprocs I'll be calling on each symbol will for the most part be insert operations to calculate symbol-dependent values, such as the 5-day moving average, average daily volume, and ATR (average true range). Most of these values will be calculated from data in a daily pricing and volume table. I'd like to streamline the retrieval of data values that would otherwise be retrieved redundantly... for example, I'd like to pull each symbol's daily pricing and volume data into a table variable... that temp table will then be passed in to the stored procedure that calls each of the aggregate functions I just mentioned. Hope that makes sense...
So my initial "outer loop" cursor-based stored procedure is below... it times out after several minutes, without returning anything to the output window.
ALTER PROCEDURE dbo.sprocSymbolDependentAggsDriver2
AS
DECLARE @symbol nchar(10)
DECLARE symbolCursor CURSOR
    STATIC FOR
    SELECT Symbol FROM tblSymbolsMain ORDER BY Symbol

OPEN symbolCursor
FETCH NEXT FROM symbolCursor INTO @symbol
WHILE @@FETCH_STATUS = 0
    SET @symbol = @symbol + ': Test.'
    FETCH NEXT FROM symbolCursor INTO @symbol
CLOSE symbolCursor
DEALLOCATE symbolCursor
When I run it without the @symbol local variable and eliminate the assignment to it in the while loop, it seems to run OK. Is there a clear violation of performance best practices within that assignment? Thanks.
"In my current case I think they're necessary (I want to run multiple
stored procedures for each stock symbol in a symbols table."
Cursors are rarely necessary.
From your example above, I think a simple WHILE loop will easily take the place of your cursor. Adapted from SQL Cursors - How to avoid them (one of my favorite SQL bookmarks)
-- Create a temporary table...
CREATE TABLE #Symbols (
    RowID int IDENTITY(1, 1),
    Symbol nvarchar(max)
)

DECLARE @NumberRecords int, @RowCount int
DECLARE @Symbol nvarchar(max)

-- Get your data that you want to loop over
INSERT INTO #Symbols (Symbol)
SELECT Symbol
FROM tblSymbolsMain
ORDER BY Symbol

-- Get the number of records you just grabbed
SET @NumberRecords = @@ROWCOUNT
SET @RowCount = 1

-- Just do a WHILE loop. No cursor necessary.
WHILE @RowCount <= @NumberRecords
BEGIN
    SELECT @Symbol = Symbol
    FROM #Symbols
    WHERE RowID = @RowCount

    EXEC <myProc1> @Symbol
    EXEC <myProc2> @Symbol
    EXEC <myProc3> @Symbol

    SET @RowCount = @RowCount + 1
END

DROP TABLE #Symbols
You don't really need all that explicit cursor jazz to build a string. Here is probably a more efficient way to do it:
DECLARE @symbol NVARCHAR(MAX) = N'';

SELECT @symbol += ': Test.'
FROM dbo.tblSymbolsMain
ORDER BY Symbol;

Though I suspect you actually wanted to see the names of the symbol, e.g.

DECLARE @symbol NVARCHAR(MAX) = N'';

SELECT @symbol += N':' + Symbol
FROM dbo.tblSymbolsMain
ORDER BY Symbol;
One caveat is that while you will typically see the ORDER BY honored when building a string this way, it is not guaranteed. So if you want to stick with the cursor, at least declare it as follows:
DECLARE symbolCursor CURSOR
LOCAL STATIC READ_ONLY FORWARD_ONLY
FOR
...
Also it seems to me like NCHAR(10) is not sufficient to hold the data you're trying to stuff into it, unless you only have one row (which is why I chose NVARCHAR(MAX) above).
And I agree with Abe... it is quite possible you don't need to fire a stored procedure for every row in the cursor, but to suggest ways around that (which will almost certainly be more efficient), we'd have to understand what those stored procedures actually do.
You need a BEGIN ... END here:
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @symbol = @symbol + ': Test.'
    FETCH NEXT FROM symbolCursor INTO @symbol
END
Also try DECLARE symbolCursor CURSOR LOCAL READ_ONLY FORWARD_ONLY instead of STATIC to improve performance.
After reading all the suggestions, I ended up using an old trick and it worked miracles!
I had a cursor that was taking almost 3 minutes to run, while the enclosing query was instant. I have other databases with more complex cursors that take only a second or less, so I ruled out a global issue with using cursors. My solution:
Detach the database in question, but ensure you tick Update Statistics.
Attach the database and check performance.
This seems to optimize all the performance parameters without the detailed effort. I am using SQL Express 2008 R2.
Would like to know your experience.
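A guess as to why the detach/attach trick helps is that ticking Update Statistics refreshes statistics; if that is the case, the same effect can usually be had without taking the database offline, for example:

-- Sketch only: refresh statistics for all tables in the current database
EXEC sp_updatestats;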

Looping through a list of databases to drop and recreate a view

I am unsure if it is possible to do this with a loop in T-SQL.
I have a list of databases I would like to loop through, and for each database drop and recreate a view as necessary.
I already have a script to drop and recreate the view. However, at the top of my script I currently have a bunch of USE statements, and I just go down the list uncommenting and rerunning the script. I would like to automate it so it would be much faster. I have done a similar thing in the past, looping through a string of database names and using it to execute a USE statement. That works except when creating views, because the view has to be separated by GO statements. Here is a snippet of my code that parses the database list; any help making it work with views would be great.
DECLARE @DBs AS VARCHAR(MAX);
DECLARE @OneDB AS VARCHAR(255);
DECLARE @CmdToExec AS VARCHAR(MAX);

SET @DBs = 'db1,db2,db3,db4,db5,db6,db7,db8,db9,db10,db11,db12,db13,db14,db15,db16'

DECLARE DB_Cursor CURSOR FOR
    SELECT SUBSTRING( ',' + @DBs + ',', n + 1,
           CHARINDEX( ',', ',' + @DBs + ',', n + 1 ) - n - 1 ) AS "dbInfo"
    FROM CommunityPAL.dbo.Numbers
    WHERE SUBSTRING( ',' + @DBs + ',', n, 1 ) = ','
      AND n < LEN( ',' + @DBs + ',' );

OPEN DB_Cursor;
FETCH NEXT FROM DB_Cursor INTO @OneDB;

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @CmdToExec = 'USE ' + @OneDB;
    EXEC (@CmdToExec);
    --statements to execute
    FETCH NEXT FROM DB_Cursor INTO @OneDB;
END

CLOSE DB_Cursor
DEALLOCATE DB_Cursor
I am pretty sure you can't do what you want exclusively within T-SQL. CREATE VIEW (and ALTER VIEW) does not accept three-part naming (database.schema.object). It also has to be the only statement in its batch, so it cannot be combined with a USE command in a single dynamic SQL call (EXEC or sp_executesql), and the effect of that USE only lasts for the duration of that batch (i.e. until any embedded GO).
However, I have done similar work from “outside” SQL server, updating specific objects in all databases of a certain "type" (where we have one per client/customer). It gets complex. A brief outline:
Use the scripting (or programming) language of your choice
Loop over a list of databases
Execute SQLCMD once for each database, and execute your [RE]CREATE VIEW script
Please check the sp_MSforeachdb procedure to loop over all the databases:
Take advantage of undocumented SQL Server iteration procedures
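For reference, a rough sketch of driving the view refresh with sp_MSforeachdb (the ? token is replaced with each database name; the LIKE filter and the view definition below are made-up placeholders, and the procedure itself is undocumented and unsupported):

EXEC sp_MSforeachdb N'
IF ''?'' LIKE ''db%''  -- hypothetical filter matching the database names in the question
BEGIN
    -- drop the old view if it exists
    EXEC (''USE [?]; IF OBJECT_ID(''''dbo.MyView'''', ''''V'''') IS NOT NULL DROP VIEW dbo.MyView;'');
    -- CREATE VIEW must be alone in its batch, so run it through a nested EXEC
    EXEC (''USE [?]; EXEC (''''CREATE VIEW dbo.MyView AS SELECT 1 AS Col1;'''');'');
END';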

Determine a cursor by condition

In SQL Server, to declare a CURSOR we say:
CREATE PROCEDURE SP_MY_PROC
(@BANKID VARCHAR(6) = '')
-------------------------------
-------------------------------
DECLARE MY_CURSOR CURSOR FOR
SELECT .......
Now, what I wonder: can we determine the SELECT statement according to a certain condition?
IF @BANKID <> '' // SELECT * FROM EMPLOYESS WHERE BANKID = @BANKID to be the cursor's query
ELSE // otherwise SELECT * FROM EMPLOYEES to be the cursor's query
Or does it have to be static?
Yes, you can do this with Dynamic SQL
IF @BANKID <> ''
    SET @sql = '
    DECLARE MyCursor CURSOR FOR
    SELECT ...'
ELSE
    SET @sql = '
    DECLARE MyCursor CURSOR FOR
    SELECT ...'

EXEC sp_executesql @sql
OPEN MyCursor
If it is such a simple example, it's better to re-write it as a single query:
DECLARE MY_CURSOR CURSOR FOR
SELECT * FROM EMPLOYESS WHERE BANKID = @BANKID OR @BANKID = ''
And, of course, we haven't addressed whether a cursor is the right solution for the larger problem or not (cursors are frequently misused by people not used to thinking of set based solutions, which is what SQL is good at).
PS - avoid prefixing your stored procedures with sp_ - These names are "reserved" for SQL Server, and should be avoided to prevent future incompatibilities (and ignoring, for now, that it's also slower to access stored procs with such names, since SQL Server searches the master database before searching in the current database).

Best way to deploy user defined functions to multiple databases in SQL 2005

I manage a server with around 400+ databases which all have the same schema. I wish to deploy a custom CLR/.NET user-defined function to them all. Is there any easy way to do this, or must it be done individually for each database?
Best Regards,
Wayne
I think if you create it in master (in MSSQL) it can be referenced from any other db in that instance. Certainly seems to work for Stored Procs anyway.
I should add that this only works if the databases are all on the same server instance...
You could write a small app to deploy the udf to the master of each SQL server instance if all 400 reside on multiple servers.
I think I just found a way. Some kind of Inception ;)
USE database only takes effect within the inner EXECUTE context, so...
DECLARE @sql NVARCHAR(MAX)
DECLARE @innersql NVARCHAR(MAX)
DECLARE @name NVARCHAR(1000)

DECLARE c CURSOR READ_ONLY
FOR
    SELECT name FROM sys.databases

OPEN c

SET @innersql = 'CREATE FUNCTION Foo(@x varchar(1)) RETURNS varchar(100) AS ' +
                ' BEGIN RETURN(@x + ''''some text'''') END;'
                --                  ^^^^          ^^^^
                -- every (') must be converted to QUAD ('''') instead of DOUBLE ('') !!!

FETCH NEXT FROM c INTO @name
WHILE @@FETCH_STATUS = 0
BEGIN
    -- CREATE FUNCTION must be the first statement in a query batch ???
    -- ok, it will be... in the inner EXEC...
    SET @sql = 'USE [' + @name + ']; EXEC (''' + @innersql + ''');'
    --PRINT @sql
    EXEC (@sql)
    FETCH NEXT FROM c INTO @name
END

CLOSE c
DEALLOCATE c
Oops, I missed the "CLR/.NET" part when reading the question. Sorry.
I'd just create a dynamic script to create it on each database. But after that I'd put it in the model database so that all new databases are created with it.
You could also use the same script to push out changes if the function ever gets modified.
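For illustration, a minimal sketch of seeding the model database so that databases created afterwards inherit the object (the function name and body here are hypothetical):

USE model;
GO
-- Any object created in model is copied into every database created afterwards
CREATE FUNCTION dbo.MyUdf (@x varchar(10))
RETURNS varchar(100)
AS
BEGIN
    RETURN ('prefix_' + @x);
END;
GO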
