Is there a way to re-compile, or at least 'check compile', stored procedures en masse? Sometimes we'll make schema changes - add or drop a column, etc. - and do our best to identify the affected procs, only to be bitten by one we missed, which pukes when it runs next. SQL Server 2005 or 2008.
I understand your question as 'when I make a schema change, I want to validate that all procedures still execute correctly with the new schema'. I.e. if you drop a column that is referenced in a SELECT in a procedure, you want that procedure flagged as requiring changes. Specifically, I do not understand your question as 'I want the procedure to recompile on next execution', since that job is taken care of for you by the engine, which will detect the metadata version change associated with any schema alteration and discard the existing cached execution plans.
My first observation is that what you describe in your question is usually the job of a TEST and you should have a QA step in your deployment process that validates the new 'build'. The best solution you could have is to implement a minimal set of unit tests that, at the very least, iterates through all your stored procedures and validates the execution of each for correctness, in a test deployment. That would pretty much eliminate all surprises, at least eliminate them where it hurts (in production, or at customer site).
Your next best option is to rely on your development tools to track these dependencies. Visual Studio 2008 Database Edition provides such functionality out of the box, and it will take care of validating any change you make to the schema.
And finally, your last option is to do something similar to what KM suggested: automate an iteration through all the procedures depending on the modified object (and all procedures depending on the dependent ones, and so on recursively). It won't suffice to mark the procedures for recompilation; what you really need is to run ALTER PROCEDURE to trigger a parse of its text and a validation against the schema (things are a bit different in T-SQL vs. your usual language compile/execute cycle; the 'compilation' per se occurs only when the procedure is actually executed). You can start by iterating through sys.sql_dependencies to find all dependencies of your altered object, and also find the 'module definition' of the dependencies from sys.sql_modules:
with cte_dep as (
    -- anchor: objects that directly reference the altered object
    select object_id
    from sys.sql_dependencies
    where referenced_major_id = object_id('<your altered object name>')
    union all
    -- recursive step: objects that reference the dependents, and so on
    select d.object_id
    from sys.sql_dependencies d
    join cte_dep r on d.referenced_major_id = r.object_id
)
, cte_distinct as (
    select distinct object_id
    from cte_dep
)
select object_name(c.object_id)
    , c.object_id
    , m.definition
from cte_distinct c
join sys.sql_modules m on c.object_id = m.object_id
You can then run through the dependent 'modules' and re-create them (i.e. drop them and run the code in the 'definition'). Note that a 'module' is more generic than a stored procedure and also covers views, triggers, functions, rules, defaults and replication filters. Encrypted 'modules' will not have the definition available, and to be absolutely correct you must also account for the various settings captured in sys.sql_modules (ANSI nulls, schema binding, EXECUTE AS clauses etc.).
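For illustration, a minimal sketch of that loop, restricted to plain stored procedures (the cursor and the procedures-only filter are my own assumptions; it ignores the SET options mentioned above, and remember that dropping a module also loses its permissions and extended properties):

declare @name nvarchar(512), @def nvarchar(max);
declare mods cursor local static for
    select quotename(schema_name(o.schema_id)) + N'.' + quotename(o.name), m.definition
    from sys.sql_modules m
    join sys.objects o on o.object_id = m.object_id
    where o.type = 'P' and m.definition is not null; -- skip encrypted modules
open mods;
fetch next from mods into @name, @def;
while @@fetch_status = 0
begin
    begin try
        exec (N'drop procedure ' + @name);
        exec (@def); -- re-parses and re-validates against the current schema
    end try
    begin catch
        print @name + N': ' + error_message(); -- flag procedures that no longer compile
    end catch
    fetch next from mods into @name, @def;
end
close mods;
deallocate mods;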
If you use dynamic SQL, that cannot be verified: it is not captured by sys.sql_dependencies, nor will it be validated by 're-creating' the module.
Overall I think your best option, by a large margin, is to implement the unit tests validation.
If you have problems with changing a table breaking stored procedures, try sp_depends:
sp_depends [ @objname = ] '<object>'

<object> ::=
{
    [ database_name. [ schema_name ] . | schema_name. ]
    object_name
}
and identify them before they break. Use it this way:
EXECUTE sp_depends YourChangedTableName
Also, you could use sp_recompile:
EXEC sp_recompile YourChangedTable
but that only marks associated stored procedures for recompile when they are run the next time.
You could use Management Studio or your source control to generate a concatenated CREATE script of all the procedures into a single file and then run that.
I know what you mean, and in many scenarios I recognize your need. You might have a look at sp_refreshsqlmodule.
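For example, a minimal sketch (the cursor and filters are my own assumptions) that re-validates every unencrypted, non-schema-bound module in place. Note that sp_refreshsqlmodule raises an error on schema-bound modules, and since procedures use deferred name resolution it may not catch every broken column reference inside them:

DECLARE @name nvarchar(512);
DECLARE refresh_cur CURSOR LOCAL STATIC FOR
    SELECT QUOTENAME(SCHEMA_NAME(o.schema_id)) + N'.' + QUOTENAME(o.name)
    FROM sys.sql_modules m
    JOIN sys.objects o ON o.object_id = m.object_id
    WHERE m.is_schema_bound = 0
      AND m.definition IS NOT NULL -- skip encrypted modules
      AND o.type IN ('P', 'V', 'FN', 'IF', 'TF', 'TR');
OPEN refresh_cur;
FETCH NEXT FROM refresh_cur INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        EXEC sys.sp_refreshsqlmodule @name; -- fails if the module no longer parses and binds
    END TRY
    BEGIN CATCH
        PRINT @name + N': ' + ERROR_MESSAGE();
    END CATCH
    FETCH NEXT FROM refresh_cur INTO @name;
END
CLOSE refresh_cur;
DEALLOCATE refresh_cur;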
Good luck, Ron
You may be able to use DBCC FREEPROCCACHE
http://msdn.microsoft.com/en-us/library/ms174283.aspx
Just iterate through them, after getting the list from sysobjects, and run sp_recompile:
Here is a link showing a sample script:
http://database.ittoolbox.com/groups/technical-functional/sql-server-l/recompile-all-stored-procedures-2764478
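As a hedged sketch of the same idea (using sys.procedures rather than the deprecated sysobjects), you can generate the calls and paste the output into a new query window to run them:

SELECT 'EXEC sp_recompile N''' + QUOTENAME(SCHEMA_NAME(schema_id)) + '.' + QUOTENAME(name) + ''';'
FROM sys.procedures;

Remember sp_recompile only marks the procedures; each one is actually recompiled the next time it executes.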
I've been looking at logging procedure executions on our reporting database to a table, and aim to come up with a generic snippet of code that can be copied into any proc we want to log.
The above led me to play around with @@PROCID. The Microsoft documentation explains that this will provide the object ID of the proc, UDF, or trigger within which it is contained. That makes sense, but I'm also seeing it return a value when run directly from a new query window. I've not been able to relate this integer to an object ID in the database - I have no idea what this ID represents. I'm sysadmin on the server I'm trying this on, so there shouldn't be any permission restrictions.
I haven't managed to find anything online about this - the only search result which looked relevant is on a login-restricted SAP support forum.
Use Master
select @@procid -- returns an integer
select object_name(@@procid) -- NULL
select * from sys.objects where object_id = @@ProcId -- 0 rows
While this isn't documented, the value corresponds to the objectid attribute of the cached query plan, as returned by sys.dm_exec_plan_attributes. The meaning of that is documented: "For plans of type "Adhoc" or "Prepared", it is an internal hash of the batch text."
To confirm, the following query returns the text of the query itself (and thus serves as a form of quine for SQL Server, albeit one that cheats as it inspects runtime values):
SELECT t.[text]
FROM sys.dm_exec_cached_plans p
CROSS APPLY sys.dm_exec_sql_text(p.plan_handle) t
CROSS APPLY sys.dm_exec_plan_attributes(p.plan_handle) a
WHERE a.attribute = 'objectid' AND a.value = @@PROCID
It depends on what tool you are using to submit the command. Many tools will create a temporary stored procedure containing your commands (using an ODBC prepared statement, for example), and then run that procedure.
Speculating, it may be that the tool is detecting that the statement is unchanged and is therefore re-using the previous prepared statement. In this case SQL Server would not be involved; it would be the client library.
Alternatively, it may be that the server is detecting the SQL is unchanged, and the preserved procid is a consequence of the query-plan caching system. (SQL Server attempts to detect repeated ad-hoc statements and optimise by re-using the plans for them.)
Either way, you should consider this a curiosity, not something to rely on for correct operation of your system, as it may well change with updates to SQL Server or your client library.
I currently have a large set of SQL scripts transforming data from one table to another, often in steps, like for example:
select input3.id as cid
, input4.name as cname
into #temp2
from input3
inner join input4
on input3.match = input4.match
where input3.regdate > '2019-01-01';
truncate table output1;
insert into output1 (customerid, customername)
select cid, cname from #temp2;
I would like to "parse" these scripts into their basic inputs and outputs
in: input3, input4
out: output1
(not necessarily this format, just this info)
To have the temporary tables falsely flagged would not be a problem:
in: input3, input4, #temp2
out: #temp2, output1
It is OK to take a little bit of time, but the more automatic, the better.
How would one do this?
Things I tried include
regexes (straightforward, but will miss edge cases, mainly falsely flagging tables in comments)
using an online parser to list the DB objects, with postprocessing by hand
looking into solving it programmatically, though writing a C# program for this, for example, would cost too much time
I usually wrap the scripts' content into stored procedures and deploy them into the same database where the tables are located. If you are sufficiently acquainted with (power)shell scripting and regexps, you can even write the code which will do it for you.
From this point on, you have some alternatives:
If you need a complete usage / reference report, or it's a one-off task, you can utilise sys.sql_expression_dependencies or other similar system views (see the sketch after this list);
Create an SSDT database project from that database. Among many other things that make database development easier and more consistent, SSDT has the "Find all references" functionality (Shift+F12 hotkey) which displays all references to a particular object (or column) across the code.
AFAIK neither of them sees through dynamic SQL, so if you have lots of it, you'll have to look elsewhere.
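For the first option, a minimal sketch, assuming one of the scripts has been wrapped into a hypothetical procedure dbo.Step1. sys.dm_sql_referenced_entities even distinguishes reads from writes, which maps directly onto your in/out lists; note that references to #temp tables are not tracked, so they simply won't appear:

SELECT referenced_schema_name,
       referenced_entity_name,
       MAX(CAST(is_selected AS int)) AS is_input,  -- read somewhere in the proc
       MAX(CAST(is_updated  AS int)) AS is_output  -- written somewhere in the proc
FROM sys.dm_sql_referenced_entities(N'dbo.Step1', N'OBJECT')
GROUP BY referenced_schema_name, referenced_entity_name;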
I am trying to hunt down a certain stored procedure which writes to a certain table (it needs to be changed); however, going through every single stored procedure is not a route I really want to take. So I was hoping there might be a way to find out which stored procedures INSERT or UPDATE a certain table.
I have tried using this method (pinal_daves_blog), but it is not giving me any results.
NOTICE: The stored procedure might not be in the same DB!
Is there another way, or can I somehow check what procedure/function has made the last insert or update to the table?
One brute-force method would be to download an add-in from RedGate called SQL Search (free), then do a stored procedure search for the table name. I'm not affiliated at all with RedGate or anything; this is just a method that I have used to find similar things, and it has served me well.
http://www.red-gate.com/products/sql-development/sql-search/
If you go this route, you just type in the table name, change the 'object types' drop-down selection to 'Procedures', and select 'All databases' in the DB drop-down.
Hope this helps! I know it isn't the most technical solution, but it should work.
There is no built-in way to tell what function, procedure, or executed batch has made the last change to a table. There just isn't. Some databases have this as part of their transaction logging but SQL Server isn't one of them.
I have wondered in the past whether transactional replication might provide that information, if you already have that set up, but I don't know whether that's true.
If you know the change has to be taking place in a stored procedure (as opposed to someone using SSMS or executing lines of SQL via ADO.NET), then @koppinjo's suggestion is a good one, as is this one from Pinal Dave's blog:
USE AdventureWorks
GO
--Searching for Employee table
SELECT Name
FROM sys.procedures
WHERE OBJECT_DEFINITION(OBJECT_ID) LIKE '%Employee%'
There are also dependency functions, though they can be outdated or incomplete:
select * from sys.dm_sql_referencing_entities( 'dbo.Employee', 'object' )
You could run a trace in Profiler. The procedure would have to write to the table while the trace is running for you to catch it.
Does anyone know of a way to verify the correctness of the queries in all stored procedures in a database?
I'm thinking of the scenario where, if you modify something in a code file, simply doing a rebuild shows you compilation errors that point you to the places where you need to fix things. In a database scenario, if you modify a table and remove a column that is used in a stored procedure, you won't know anything about the problem until the first time that procedure runs.
What you describe is what unit testing is for. Stored procedures and functions often require parameters to be set, and if the stored procedure or function encapsulates dynamic SQL, there's a chance that a corner case is missed.
Also, all you mention is checking for basic errors, nothing about validating the data returned. For example, I can change the precision on a numeric column...
This also gets into the basic testing that should occur for the immediate issue, and regression testing to ensure there aren't unforeseen issues.
You could create all of your objects with SCHEMABINDING, which would prevent you from changing any underlying tables without dropping and recreating the views and functions built on top of them.
Depending on your development process, this could be pretty cumbersome. I offer it as a solution though, because if you want to ensure the correctness of all procedures in the db, this would do it.
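A minimal sketch of the idea, using a hypothetical dbo.Employee table:

-- Schema-bound objects must reference tables by two-part names and cannot use SELECT *
CREATE VIEW dbo.v_EmployeeNames
WITH SCHEMABINDING
AS
SELECT EmployeeID, LastName
FROM dbo.Employee;
GO
-- While the view exists, this now fails instead of silently breaking the view:
-- ALTER TABLE dbo.Employee DROP COLUMN LastName;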
I found this example on MSDN (SQL Server 2012). I guess it can be used in some scenarios:
USE AdventureWorks2012;
GO
SELECT p.name, r.*
FROM sys.procedures AS p
CROSS APPLY sys.dm_exec_describe_first_result_set_for_object(p.object_id, 0) AS r;
Source: sys.dm_exec_describe_first_result_set_for_object
I need to manually migrate modified stored procedures from a DEV SQL Server 2005 database instance to a TEST instance. Except for the changes I'm migrating, the databases have the same schemas. How can I quickly identify which stored procedures have been modified in the DEV database for migration to the TEST instance?
I assume I can write a query against some of the system tables to view database objects of type stored procedure, sorted by some sort of last-modified or compiled date, but I'm not sure. Maybe there is some sort of free utility someone can point me to.
Instead of using sysobjects, which is not recommended anymore, use sys.procedures:
select name,create_date,modify_date
from sys.procedures
order by modify_date desc
You can add the WHERE clause yourself, but this will list the procedures in order of modification date, descending.
You can execute this query to find all stored procedures modified in the last x number of days:
SELECT name
FROM sys.objects
WHERE type = 'P'
AND DATEDIFF(D,modify_date, GETDATE()) < X
There are some special cases where scripts might not give optimal results.
One is deleting stored procedures or other objects in the dev environment - you won't catch this using system views, because the object won't exist there any longer.
Also, I'm not really sure this approach can work on changes such as permissions and similar.
In such cases it's best to use some third-party tool, just to double-check nothing is missed.
I've successfully used ApexSQL Diff in the past for similar tasks, and it worked really well on large databases with 1000+ objects, but you can't go wrong with SQL Compare, which is already mentioned here, or basically any other tool on the market.
Disclaimer: I'm not affiliated with any of the vendors I'm mentioning here, but I do use both sets of tools (Apex and RG) in the company I work for.
Although not free, I have had good experience using Red Gate's SQL Compare tool. It worked for me in the past, and they have a free trial available which may be good enough to solve your current issue.
You can also use the following code snippet:
USE AdventureWorks2008;
GO
SELECT SprocName=name, create_date, modify_date
FROM sys.objects
WHERE type = 'P'
AND name = 'uspUpdateEmployeeHireInfo'
GO
There are several database compare tools out there. One that I've always liked is SQL Compare by Red Gate.
You can also try using:
SELECT name
FROM sys.objects
WHERE modify_date > @cutoffdate
In SQL 2000 that wouldn't have always worked, because using ALTER didn't update the date correctly, but in 2005 I believe that problem is fixed.
I use a SQL compare tool myself, though, so I can't vouch for that method 100%.
You can use the following type of query to find modified stored procedures; use any number instead of 7, as per your needs:
SELECT name
FROM sys.objects
WHERE type = 'P'
AND DATEDIFF(D,modify_date, GETDATE()) < 7