I recently learned about some uses of undocumented stored procedures (such as sp_MSforeachdb) in MS SQL Server.
What are they, and why are they 'undocumented'?
+1 on "precipitous". These procs are generally used by replication or the management tools. They are undocumented because the dev team reserves the right to change them at any time; many have changed over the years, especially in SQL 2000 SP3 and SQL 2005.
My speculation would be that it's because they are used internally, are not supported, and might change. sp_who2 is another one that I find very handy. Maybe Management Studio's Activity Monitor uses that one; it produces the same output. Did I mention that undocumented probably means unsupported? Don't depend on these to stick around or to produce the same results next year.
sp_MSforeachdb and sp_MSforeachtable are unlikely to change.
Both can be used like this:
EXEC sp_MSforeachtable "print '?'; DBCC DBREINDEX ('?')"
where the question mark '?' is replaced by the table name (or the database name in the other proc).
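For example, a minimal sketch using sp_MSforeachdb that just prints each database name (any command containing '?' works the same way):
EXEC sp_MSforeachdb "PRINT '?'"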
Undocumented means unsupported and that MS reserves the right to change or remove said commands at any time without any notice whatsoever.
Any documented features go through two deprecation stages before they are removed. An undocumented command can be removed, even in a service pack or hotfix, without any warning or any announcements.
It is most likely that one of the internal SQL Server developers needed these stored procedures to implement the functionality they were working on, so they developed them and used them in their code. When working with the technical documentation people, they covered the scope of their project and included in the official documentation only the portion that applied to customers. Over time, people found the extra stored procedures (because you can't hide them) and started using them. While the internal SQL Server developers wouldn't want to change these undocumented procedures, I'm sure they would in two seconds if they had to for their next project.
As others have said, they are unsupported features that are not intended for general consumption, although they can't stop you from having a go, and indeed, sometimes they can be very useful.
But as internal code, they might have unexpected side-effects or limitations, and may well be here one day and gone the next.
Use them carefully if you wish, but don't rely on them entirely.
We are looking to implement unit tests using the tSQLt test framework. It has a prerequisite that SQL CLR must be enabled using this command:
EXEC sp_configure 'clr enabled', 1; RECONFIGURE;
I am curious to know: what is the purpose of SQL CLR, and what are the risks of enabling it in a production environment?
PURPOSE
SQLCLR allows one to do things that either:
can't be done in T-SQL, or
can't be done as efficiently as in T-SQL
There are plenty of things that can be done in both, and that T-SQL is actually much better at. In those cases it is inappropriate to use SQLCLR, so it is best to research first to make sure that the operation cannot be done in T-SQL, or would definitely be slower there.
As an example of the performance difference: T-SQL scalar UDFs prevent parallel execution plans, but SQLCLR scalar UDFs, as long as they do no data access and are marked as IsDeterministic=true, do not prevent parallel execution plans.
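For context, the T-SQL side of wiring up such a function looks roughly like this (the assembly, path, class, and method names here are made up for illustration; the IsDeterministic flag itself is set via the SqlFunction attribute in the .NET code):
CREATE ASSEMBLY MyClrLib FROM 'C:\clr\MyClrLib.dll' WITH PERMISSION_SET = SAFE;
GO
-- Bind a T-SQL scalar function to the .NET method
CREATE FUNCTION dbo.MyScalarFunc (@input INT) RETURNS INT
AS EXTERNAL NAME MyClrLib.[MyNamespace.MyClass].MyScalarFunc;
GO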
For more details on what SQLCLR is and is not, please see the first article in the Stairway to SQLCLR series that I am writing for SQL Server Central:
Stairway to SQLCLR Level 1: What is SQLCLR?
Or, to get a sense of what can be done in SQLCLR, please see my SQL# project, which is a library of over 320 stored procedures and functions, many of which are in the Free version, and many of which work in SAFE mode: SQLsharp.com.
RISKS
The risks vary based on the PERMISSION_SET (i.e. SAFE, EXTERNAL_ACCESS, or UNSAFE) that the Assembly is marked as, and on what is being done. It is possible to do things in an UNSAFE Assembly that cannot be done in regular T-SQL (except that many of those dangerous things can already be done via some extended stored procedures, xp_cmdshell, and the OLE Automation procedures -- sp_OA*).

An Assembly marked as SAFE cannot reach outside of the database, so it is generally quite safe. BUT you can still lock up the system via a Regular Expression that exhibits "catastrophic backtracking" (this can be mitigated starting in .NET Framework 4.5, so SQL Server 2012 and newer, by setting a max time limit on the RegEx operation).

An Assembly marked as UNSAFE can write to static variables, which, in the context of the shared App Domain model used by SQLCLR, allows for shared memory between Sessions. This can enable caching, but when not used properly it easily leads to race conditions.
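If you want to see what is already installed on an instance, and at which permission level, the standard catalog view makes that easy:
SELECT a.name, a.permission_set_desc, a.create_date
FROM sys.assemblies AS a;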
TESTING
As for tSQLt, I do not believe that you are required to use the SQLCLR component. I thought I saw that it just enabled some extended functionality. Either way, the source code is available on GitHub so you can check it out to see what it is doing. It has been a while since I looked at it, but from what I remember, it should not present much of a risk for the little that it is doing (especially in a Dev / QA environment).
Another option that doesn't use SQLCLR is DbFit. I have always preferred DbFit as it is completely external to the DB. It is based on the FitNesse framework, is written in Java, and you manage the tests via wiki-style pages. By default it wraps each test in a Transaction and rolls everything back when the test is finished (i.e. clean-up). It is worth taking a look at.
Download: DbFit project on GitHub
Tutorial: Using the DbFit Framework for Data Warehouse Regression Testing
SQLCLR allows you to create .NET assemblies and run code inside them from within SQL Server.
The risks vary depending on the permission set of the assembly. They break down something like this:

Permission Set: Risk

SAFE: You cannot do anything more than what you can do in T-SQL, so it is fairly safe.

EXTERNAL_ACCESS: You can call code in .NET assemblies approved by Microsoft, such as ADO.NET. Fairly safe, but still a risk.

UNSAFE: You can do almost anything that the .NET Framework allows you to do. In reality, you will shoot yourself in the head unless you know what you are doing.
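For reference, the permission set is declared when the assembly is created and can be changed later (the assembly name here is illustrative):
ALTER ASSEMBLY MyClrLib WITH PERMISSION_SET = EXTERNAL_ACCESS;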
I am working on some stored procedures at a client site.
To debug stored procedures using SSMS, it seems you must be a member of the [sysadmin] server role. However, they only have PROD and TEST database instances, and the DBA will not grant me these permissions.
According to him, using inline PRINT statements is considered just as good as the ability to debug. That doesn't seem quite right to me, so I was thinking of escalating my request, but thought I'd first ask here: is this a common sentiment in the industry (that the ability to debug is "not necessary")?
Though this is a bit of an opinion-based question: yes, I have always debugged like that. That is, take the procedure body and run each code block separately to find where the problem is, then fix it accordingly.
You might also want to look at the execution plan of the procedure, which will help with debugging.
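A quick sketch of one way to do that while running the individual blocks (this returns per-statement plan and row-count information after execution):
SET STATISTICS PROFILE ON;
GO
-- run the suspect code block here
SET STATISTICS PROFILE OFF;
GO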
In my experience, yes.
I once found the debugging functionality very useful when I was transitioning from being an ASP developer to a SQL developer, but as I've become more comfortable with SQL, I find that I can do all my debugging with PRINT statements.
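For what it's worth, a typical PRINT-style trace looks something like this (the table and variable names are purely illustrative):
DECLARE @rows INT;
SELECT @rows = COUNT(*) FROM dbo.SomeTable;
PRINT 'Rows before update: ' + CAST(@rows AS VARCHAR(12));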
Also, in addition to being a sysadmin, did you know that you have to run the debugger locally on the SQL Server machine to use it? It's a lot more trouble than it's worth.
We're migrating from an old version of MS SQL Server to a much more recent version. The trouble is, we have many equally old stored procedures that won't run in the new version; in particular, many use the "*=" notation instead of LEFT JOIN or RIGHT JOIN. Upgrading them by hand would be extremely error-prone and time-consuming (just one of the databases I'm in charge of has 900+ stored procedures, and I've yet to check the other four!), so I'm wondering if there's any software out there that can upgrade all these procedures. Any help would be greatly appreciated!
You don't mention the target version of SQL Server, but I believe all of the recent versions have a corresponding SQL Upgrade Advisor. This tool should be useful, as it will identify all of the places where you use the obsolete syntax.
Assuming you only have server-side code (and no dynamic SQL), this approach will work.
However, I don't think you will find any tool that can do this automatically for all cases -- the old style syntax was at times ambiguous and there are other possible problems.
You might find this article useful, though, as it shows how you can use SSMS to convert an old-style join. It is not perfect, and it does not always yield the highest-performing equivalent join, but it may be better than nothing.
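To illustrate the kind of rewrite involved (the table and column names here are made up):
-- Old-style outer join (pre-ANSI syntax):
SELECT c.Name, o.OrderDate
FROM Customers c, Orders o
WHERE c.CustomerID *= o.CustomerID

-- ANSI equivalent:
SELECT c.Name, o.OrderDate
FROM Customers c
LEFT JOIN Orders o ON c.CustomerID = o.CustomerID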
I never used the old-style joins myself, as my first version of MS SQL was 7.0.
I have a question about speeding up SQL Server 2000.
I want to use a caching mechanism, but I don't know how.
I found some articles about it, but can you give an example of how to use one?
For example:
there is a stored procedure, sp_stackOverFlow, that executes every time a user enters the program/web site, and it clearly slows things down.
Is there a way of caching the results of sp_stackOverFlow, refreshing them every 2 minutes or so?
Your question isn't clear, not least because it isn't obvious what the stored procedure does. If the results are different for every execution and/or user then they cannot easily be cached anyway.
But more fundamentally, "I have a slow stored procedure" does not automatically mean "I need caching"; the database engine itself already caches data when it can. You need to understand why the stored procedure is running slowly: underpowered hardware, poor TSQL code, poor data model design and poor indexing are all very common issues that have major effects on performance.
You can find a lot of information on this site and by Googling about how to troubleshoot slow execution times for procedures, but you can start by reviewing the execution plan for the procedure in Query Analyzer and tracing the execution using Profiler. That will immediately tell you which statements are taking the most time, if there are table scans happening etc.
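For example, a quick way to see the estimated plan for the procedure in Query Analyzer on SQL Server 2000 (note that while SHOWPLAN_TEXT is ON, statements are not actually executed):
SET SHOWPLAN_TEXT ON
GO
EXEC sp_stackOverFlow
GO
SET SHOWPLAN_TEXT OFF
GO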
Because performance troubleshooting is potentially complex, if you need more assistance please post short, specific questions about individual issues. If the code for your stored procedure is very short (< 30 lines formatted) people may be willing to comment on it directly, otherwise it would be better to post only the individual SQL statements that are causing a problem.
Finally, mainstream support for MSSQL 2000 stopped 3 years ago, so you should definitely look into upgrading to a newer version. The performance tools in newer versions will make resolving your issue much easier.
I've been consulting Google for some time now, but it hasn't been able to give me a satisfactory answer...
In a SQL Server 2005 trace, I've got lots of "exec sp_execute" statements. I know they are connected to a corresponding "exec sp_prepare" statement which specifies the actual SQL.
But...
One: Is it possible to find out the SQL behind the sp_execute without finding the sp_prepare?
Two: What type of construct would typically hide behind the sp_execute? That is, is it a stored procedure? Is it just a string in code? Or what?
Three: Should I fear bad performance seeing these in the trace?
Any input is appreciated
Use
select * from sys.dm_exec_query_plan(@plan_handle)
to generate an XML document that indicates what SQL the sp_execute call is using (substitute the actual plan handle, e.g. one captured in your trace).
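If you don't have a plan handle to start from, a sketch like this walks the plan cache for prepared statements directly (standard DMVs in SQL Server 2005 and up):
SELECT st.text, qp.query_plan
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
WHERE cp.objtype = 'Prepared'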
Those are API Server Cursors, most probably used by an old (or not-so-old, but badly developed) application.
99% of the time, cursors hurt performance on your server; disk and network I/O are the usual victims.
Read this; it helped me understand how server-side cursors work.
Late answer, but I recently had an application with bad performance executing sp_prepare and sp_execute.
One: Answered before
Two: It could be anything: stored procedures, or basically any valid SQL query.
Three: I had problems with SQL Server failing to generate good execution plans when the application was using sp_prepare. Basically, SQL Server analyzes the incoming parameter values to generate a good execution plan, but with sp_prepare no values for the parameters are supplied, since they are only added when sp_execute runs. So in the meantime, SQL Server applies generic costs for the different operators and might very well generate a suboptimal plan.
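For context, the prepare/execute pattern the API emits looks roughly like this when replayed in T-SQL (the query itself is illustrative):
DECLARE @handle INT
EXEC sp_prepare @handle OUTPUT, N'@p1 INT', N'SELECT * FROM dbo.Orders WHERE OrderID = @p1'
-- The plan is compiled here, before any parameter value is known
EXEC sp_execute @handle, 42
EXEC sp_unprepare @handle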
If you look at the reads/CPU usage in your traces, you should be able to determine whether your queries are behaving badly or as expected.
Also see http://blogs.infosupport.com/speeding-up-ssis-literally