Stored Procedures MSSQL2005 - sql-server

If you have a lot of Stored Procedures and you change the name of a column of a table, is there a way to check which Stored Procedures won't work any longer?
Update: I've read some of the answers and it's clear to me that there is no easy way to do this. Would it be easier to move away from Stored Procedures?

I'm a big fan of SysComments for this:
SELECT DISTINCT Object_Name(ID)
FROM SysComments
WHERE text LIKE '%Table%'
AND text LIKE '%Column%'

There's a book-style answer to this, and a real-world answer.
First, for the book answer, you can use sp_depends to see what other stored procs reference the table (not the individual column), and then examine those to see if they reference the column:
http://msdn.microsoft.com/en-us/library/ms189487.aspx
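For example (dbo.Customers is a placeholder for the table whose column you renamed):

-- Lists stored procedures, views and functions that reference the table.
EXEC sp_depends @objname = N'dbo.Customers';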
The real-world answer, though, is that it doesn't work in a lot of cases:
Dynamic SQL strings: if you're building strings dynamically, either in a stored proc or in your application code, and then executing that string, SQL Server has no way of knowing what your code is doing. You may have the column name hard-coded in your code, and that'll break.
Embedded T-SQL code: if you've got T-SQL embedded in your application code (not in SQL Server), then nothing on the SQL Server side will detect it.
Another option is to use SQL Server Profiler to capture a trace of all activity on the server, then search the captured queries for the field name you want. It's not a good idea on a production server, because the trace incurs some overhead, but it does work - most of the time. Where it will break is if your application does a "SELECT *" and then expects a specific field name to come back as part of that result set.
You're probably beginning to get the picture that there's no simple, straightforward way to do this.

While this will take the most work, the best way to ensure that everything works is to write integration tests.
Integration tests are just like unit tests, except in this case they would integrate with the database. It would take some effort, but you could easily write tests that exercise each stored procedure to ensure it executes w/o error.
In the simplest case it would just execute the stored procedure and make sure there is no error, without worrying about the actual results. If your tests just execute the procs without checking results, you can write a lot of this generically.
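As a very rough sketch of that generic approach (this is not a full test framework - it assumes the procedures take no parameters, and it should only be run against a disposable test database, since executing the procedures may modify data):

-- Smoke test: execute every parameterless user stored procedure and
-- record the ones that raise errors.
DECLARE @proc sysname, @sql nvarchar(max);
DECLARE @failures TABLE (ProcName sysname, ErrorMessage nvarchar(4000));

DECLARE proc_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT QUOTENAME(SCHEMA_NAME(p.schema_id)) + N'.' + QUOTENAME(p.name)
    FROM sys.procedures AS p
    WHERE p.is_ms_shipped = 0
      AND NOT EXISTS (SELECT 1 FROM sys.parameters AS prm
                      WHERE prm.object_id = p.object_id);  -- parameterless only

OPEN proc_cursor;
FETCH NEXT FROM proc_cursor INTO @proc;
WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        SET @sql = N'EXEC ' + @proc + N';';
        EXEC sys.sp_executesql @sql;
    END TRY
    BEGIN CATCH
        INSERT INTO @failures (ProcName, ErrorMessage)
        VALUES (@proc, ERROR_MESSAGE());
    END CATCH
    FETCH NEXT FROM proc_cursor INTO @proc;
END
CLOSE proc_cursor;
DEALLOCATE proc_cursor;

SELECT * FROM @failures;   -- anything returned here needs attention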
To do this you would need a database to execute against. While you could setup the database and deploy your stored procs manually, the best way would be to use continuous integration to automatically get the latest code (database DDL, stored procs, tests) from your source control system, build your database, and execute your tests. This would happen every time you committed changes to source control.
Yes, it seems like a lot of work. It is a lot of work, but the payoff is also big: the ability to ensure that your changes don't break anything lets you move your product forward faster and with better quality.
Take a look at NUnit and NDbUnit

I'm sure there are more elegant ways to address this, but if the database isn't too complex, here's a quick and dirty way:
Select all the sprocs and script to a query window.
Search for the old column name.

If you are only interested in finding the column usage in stored procedures, probably the best way is a brute-force search for the column name in the definition column of the sys.sql_modules table, which stores the definitions of stored procedures and functions.
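For example (OldColumnName is a placeholder for the column you renamed):

SELECT SCHEMA_NAME(o.schema_id) AS SchemaName,
       o.name AS ObjectName,
       o.type_desc
FROM sys.sql_modules AS m
JOIN sys.objects AS o ON o.object_id = m.object_id
WHERE m.definition LIKE '%OldColumnName%';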

Related

Update Multiple SSRS Reports in bulk

I need to make identical changes to hundreds of reports, and I was hoping to do this via SQL instead of opening each individual report and its query. I can extract the report query via XML and generate my list of reports, their location, and the query being used. But what I cannot figure out is how to update the report query and then get that updated query back into the Catalog database (or wherever it lives) so that the report itself reflects the changes when executed. I have never seen where this is possible, but maybe someone on here has tried to do this or knows that it's flat out not possible.
I could use SSIS to do this, but I would prefer not to download all the RDLs, update them, and then redeploy/upload the reports. I was hoping to update the reports/RDLs in place.
You shouldn't have to download the RDLs, they should already be in your source control system, and ideally collected and grouped into project(s). If so, you are in luck - you can use the global search/replace capabilities of Visual Studio (BIDS) or Notepad++ to make your change.
If your change was to the structure of the report then you could simply write a quick nasty console app to load the RDL and manipulate the XML structure. But things like the report query are held as free-form text in a node, making it harder to apply mass updates in a reliable way.
You could look to refactor the report queries into stored procedures and/or functions, this will make future updates a bit easier. In any case if you change the report RDLs you've got no option but to republish the modified ones - there's no such thing as an in-place change on the server (having your queries as stored procedures would have avoided this issue).

Unit testing results between several stored procedures?

I need to unit test results between several stored procedures on a single database (certain values between different result sets ought to be the same). Also, I need to be able to copy these unit tests so that several identical databases will perform them identically when I choose to start the tests.
I want to use OpenRowSet to dump these results to temp tables and then compare these tables, possibly using a stored procedure that I can execute once a week.
Before I configure the servers to allow this are there any reasons not to use OpenRowSet? If so then what other options might I have?
The main reason to not use OpenRowSet is that you don't need to use it. Since you want to do testing, you should use a testing framework. I am a huge fan of DbFit ( http://dbfit.github.io/dbfit/ ). Your tests are completely isolated from your database. It is very easy to set up and modify. And you can even compare result sets between two Stored Procedures. It is very easy to automate. It is easy to create subsections and only run tests in a particular subsection, or an individual test. You can stage the test with DML statements and everything will get rolled back at the end of the test. You can use variables to grab data from a query or procedure and use that in calls / queries that follow.
Well... Perhaps using a unit testing framework is a step too far for you. If you don't want to go that far, try the below.
From MSDN, OpenRowset "is an alternative to accessing tables in a linked server and is a one-time, ad hoc method of connecting and accessing remote data by using OLE DB". You have stated that there are several databases on the same server (i.e. no linked server), so OpenRowSet seems to be overkill. You can still get the bulk performance gains by using "SELECT INTO" statements to create your data tables in a new unit-testing database (I wouldn't advocate creating test tables in your prod databases). This would have a stored proc that calls each individual database using three- or four-part naming. If you really wanted to, you could have a table of database/stored-procedure pairs and use dynamic SQL to execute them all. Once you have all the data, your stored proc just needs to compare it.
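As a sketch of the comparison itself (the procedure names and the column list are placeholders - INSERT ... EXEC just needs a table whose columns match each procedure's result set):

-- Capture the output of two procedures and compare the result sets.
CREATE TABLE #ResultA (KeyCol int, ValueCol decimal(18, 2));
CREATE TABLE #ResultB (KeyCol int, ValueCol decimal(18, 2));

INSERT INTO #ResultA (KeyCol, ValueCol) EXEC dbo.usp_ReportA;
INSERT INTO #ResultB (KeyCol, ValueCol) EXEC dbo.usp_ReportB;

-- Both queries should return no rows if the result sets match.
SELECT * FROM #ResultA EXCEPT SELECT * FROM #ResultB;
SELECT * FROM #ResultB EXCEPT SELECT * FROM #ResultA;

DROP TABLE #ResultA;
DROP TABLE #ResultB;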
Another way would be to introduce a kind of ReturnType in your stored procedures, or use an existing parameter to send that value. When the ReturnType is set to 'INSERT_RESULTS_TO_TEST' or some such, have the stored procedures' final statements insert records into test tables designed for testing, instead of returning their default result.
If necessary, add columns to the test tables to indicate which server, which database, and which stored procedure produced the result; say, call this ResultSetID.
Then, for your comparison, use self-joins on the test tables, comparing the values between different ResultSetIDs.
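A sketch of that self-join comparison (dbo.ProcResults, BusinessKey and Amount are invented names for illustration):

-- Rows where two result sets, identified by ResultSetID, disagree.
SELECT a.BusinessKey,
       a.Amount AS AmountFromSet1,
       b.Amount AS AmountFromSet2
FROM dbo.ProcResults AS a
JOIN dbo.ProcResults AS b
    ON b.BusinessKey = a.BusinessKey
WHERE a.ResultSetID = 1
  AND b.ResultSetID = 2
  AND a.Amount <> b.Amount;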

How to add a column to a table from within lightswitch

I have a SQL Server Database and it is a requirement for my lightswitch app that the administrator be able to add new columns to certain tables. Is that even possible? The only way I could think to do it is to write an "ALTER" stored procedure in the database and call it from lightswitch, but that seems a little messy. Any Ideas?
Although you'll be able to find a way to physically add a new column to a table after an application has been published, LightSwitch is not going to like it. You may even find that the application refuses to run.
For an attached database, the model that LightSwitch creates for it can only be updated by running the Update Data Source command, which can only be done by the developer at design-time. And if the database is the intrinsic database, it too can only be changed at design-time.
So the short answer to "Is that even possible?" is "no".
An ALTER stored procedure would probably be the best way to achieve what you are talking about, but I wouldn't recommend it.
How are you then going to store and retrieve data from these columns? What happens when you start to get column name collisions between tables?
It might be better if you give us a higher level description of what you are trying to achieve, but taking a guess I would suggest you look at the entity-attribute-value pattern for storing arbitrary user data.
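If it helps, a bare-bones sketch of that pattern might look like this (table and column names are only illustrative):

-- User-defined "columns" become rows, so an administrator can add
-- attributes without altering the physical schema.
CREATE TABLE dbo.CustomAttribute (
    AttributeID   int IDENTITY(1, 1) PRIMARY KEY,
    AttributeName nvarchar(100) NOT NULL UNIQUE
);

CREATE TABLE dbo.CustomAttributeValue (
    EntityID       int NOT NULL,          -- key of the row being extended
    AttributeID    int NOT NULL REFERENCES dbo.CustomAttribute (AttributeID),
    AttributeValue nvarchar(max) NULL,    -- everything stored as text in this sketch
    PRIMARY KEY (EntityID, AttributeID)
);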

Database cleanup

I inherited a SQL Server database that is not well designed (a consulting company came in to do the project and left without completing it).
The main issues I have with this database are:
Data types: a lot of tinyint and text types.
Tables are not normalized: some of the keys are names instead of sequential IDs.
A lot of tables that I am not sure are being used.
A lot of stored procedures that I am not sure are being used.
Badly named tables and stored procs.
I also inherited the asp.net application that runs against this database.
I would like to clean this database up. I understand that changing the data types will have to happen table by table. For getting rid of all the extra tables and stored procs, what is the easiest way to do so?
Any other tips to make it cleaner and smaller are appreciated.
I also want to mention that I have the Red Gate tools installed (if that helps).
Thank you
Check out SQL Server Data Tools; they allow you to create a project from a live database. One of the things you can do there is right-click 'Find Usages' on tables, views and functions.
So long as the previous developer used stored procedures and views rather than querying the tables directly, you should be able to find the references that way without breaking your project.
Also, for finding stored procedures that are not used, put in some basic logging at the top of each stored procedure in your application. After X days, the ones that haven't shown up in your log table are likely safe to remove; otherwise, a tedious search through your .NET code will find them.
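A minimal version of that logging (table and column names are just examples) could be:

-- One-time setup: a table to record each procedure execution.
CREATE TABLE dbo.ProcUsageLog (
    ProcName   sysname  NOT NULL,
    ExecutedAt datetime NOT NULL DEFAULT GETDATE()
);

-- Then add this at the top of each stored procedure you want to track:
INSERT INTO dbo.ProcUsageLog (ProcName)
VALUES (OBJECT_NAME(@@PROCID));   -- name of the currently executing procedure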

Common function / stored procedures for all databases

We have a database server and it has about 10 databases.
I would like to create some functions / stored procedures which can be used in all databases.
For example, we can use sp_executesql in any database.
We have some requirements like that (getting the current academic year, financial year, etc.).
Is it doable?
As others have suggested, you could put objects into the master database, but Microsoft explicitly recommends that you should not do that. I find that solution to be rather risky anyway, because the master database is 'owned' by the system, not by you, so there are no guarantees that it will continue to behave in the same way in the future.
Instead, I would consider this to be primarily a deployment issue. There are (at least) two strategies you could use:
Deploy the objects to every database
Deploy them to one 'reference' database that is only used for shared objects and create synonyms in the other databases
The second option is perhaps the better one, because if your functions use tables (e.g. you use a calendar table to get the academic year, which is much easier than calculating it) then you would have to create the same tables in every database too. By using synonyms, you only have to maintain one set of tables.
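For instance, assuming a shared database called RefDB that holds a function dbo.fn_GetAcademicYear (both names are placeholders), each of the other databases only needs a synonym:

-- Run in each of the ten databases; the real object lives only in RefDB.
CREATE SYNONYM dbo.fn_GetAcademicYear
    FOR RefDB.dbo.fn_GetAcademicYear;

-- Callers in that database can then use it as if it were local:
SELECT dbo.fn_GetAcademicYear(GETDATE()) AS AcademicYear;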
For the actual deployment, it's straightforward to use scripting to manage the objects, because you just need a list of databases to connect to and run each DDL script against. You can do that using batch files and SQLCMD (perhaps with SQLCMD variables in your .sql scripts), or drive it from PowerShell or any other language that you prefer.
Depending upon what the SP actually does, you want to create the procedure in master, name it with sp_ and mark it as a system procedure:
http://weblogs.sqlteam.com/mladenp/archive/2007/01/18/58287.aspx
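The approach from that link looks roughly like this (sp_get_academic_year and its logic are invented for illustration; marking the proc as a system object is what lets it run in the context of whichever database calls it):

USE master;
GO
CREATE PROCEDURE dbo.sp_get_academic_year
AS
BEGIN
    -- Placeholder logic: assume the academic year starts on 1 September.
    SELECT CASE WHEN MONTH(GETDATE()) >= 9
                THEN YEAR(GETDATE())
                ELSE YEAR(GETDATE()) - 1
           END AS AcademicYearStart;
END
GO
-- sp_MS_marksystemobject is undocumented; it flags the proc as a system object.
EXEC sp_MS_marksystemobject N'sp_get_academic_year';
GO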
A couple of options:
You can use a system stored procedure as Cade says. I've done this in the past and it works ok. One warning on this is that the sp_MS_marksystemobject procedure is undocumented, which may mean that it could vanish or change without warning in future SQL versions. Thinking back I think there were other problems using this approach with functions though.
Another approach is to use standardized procedures and functions, and roll them out across your databases using sp_MSforeachdb to run code against every database. If you need to run against only your 10 databases, you can copy the code from that procedure and modify it to check that a database matches your schema before running the code (or you can write your own version that does a similar thing).
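A rough sketch of the sp_MSforeachdb route (the database names and the object being created are placeholders; note that sp_MSforeachdb is itself undocumented):

-- sp_MSforeachdb substitutes ? with each database name in turn.
EXEC sp_MSforeachdb N'
IF ''?'' IN (''AppDb1'', ''AppDb2'', ''AppDb3'')
    EXEC [?].sys.sp_executesql
        N''CREATE PROCEDURE dbo.usp_CommonHelper AS SELECT 1 AS Placeholder;'';
';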
