How to call a SQL script from another in SQL Server 2005 - sql-server

I need to test several different processes for the application we're building. Each process requires a particular table in our database to have data, and all of these tables have foreign key constraints referencing other tables as well.
I've written SQL scripts that populate the table I'm interested in as well as its dependencies, but it turns out that in a few of these scripts I've duplicated a lot of code when populating the dependency tables.
I would like to factor the duplicated code out into a separate script, but I don't know how, if it's even possible, to execute a SQL script from within another one.
An important part of all of this would also be being able to get the @@IDENTITY value in the calling script from the called one.
Any help will be greatly appreciated.
Best regards.
Clarification: By script I mean a file saved on disk. I don't want to be creating and deleting temporary stored procedures for this.

When I hear the word "script", I think of a file containing a series of commands; if you're asking how to get SQL Server to load a file of commands from another file of commands, I'm not sure of an easy way to do that.
If you can save your duplicate code as a stored procedure, you can certainly call a stored procedure from another stored procedure within SQL Server. You could then pass the @@IDENTITY value back to the caller through an OUTPUT parameter (and you may want to look at SCOPE_IDENTITY() instead).
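For example, a minimal sketch of that pattern (the table, column, and procedure names here are hypothetical):

-- Shared procedure that populates a dependency table and hands
-- the generated key back to the caller (hypothetical names).
CREATE PROCEDURE dbo.PopulateCustomer
    @Name VARCHAR(100),
    @NewId INT OUTPUT
AS
BEGIN
    INSERT INTO dbo.Customer (Name) VALUES (@Name);
    -- SCOPE_IDENTITY() is preferable to @@IDENTITY because it is
    -- not affected by triggers that insert into other tables.
    SET @NewId = SCOPE_IDENTITY();
END
GO

-- Calling script:
DECLARE @CustomerId INT;
EXEC dbo.PopulateCustomer @Name = 'Test customer', @NewId = @CustomerId OUTPUT;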
HTH,
Stu

Related

SQL Server stored procedure script

I am wondering if I can modify a generated script to copy data from one database to another. I used the "Generate Scripts" tool; can I then take the latest run each time and create a stored procedure that takes the latest tables and inserts their data into the new database?
Source tables have a date suffix, so a table called ABC_05162016 would on the next run be ABC_06162016. Is there a way to take the original generated script and then update that last part once, instead of re-selecting all the tables every time this needs to be done?
Thanks for the help. Yes, I have looked around for an answer to this, but everything I've come across deals with importing/exporting data or with the Generate Scripts tool itself.
If there is a better way than those three approaches, I would appreciate knowing that as well.
Using SQL Server 2008.
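One hedged sketch of how that could work (not from the thread itself): keep the generated INSERT ... SELECT statements, but build the suffixed source table name with dynamic SQL so only one variable changes per run. The table and database names below are hypothetical, as is the MMDDYYYY suffix convention:

-- Compute today's MMDDYYYY suffix (hypothetical naming convention).
DECLARE @Suffix CHAR(8) = REPLACE(CONVERT(CHAR(10), GETDATE(), 110), '-', ''); -- e.g. 06162016
DECLARE @Sql NVARCHAR(MAX);

SET @Sql = N'INSERT INTO TargetDb.dbo.ABC '
         + N'SELECT * FROM SourceDb.dbo.' + QUOTENAME('ABC_' + @Suffix) + N';';
EXEC sp_executesql @Sql;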

execute rss script on several servers using SSIS, storing results in a table

I found a wonderful script that collects all the (shared) data sources used on a report server:
LINK
I simply love this script.
However, I am looking for a way to execute this script on several report servers and add the results to a centralised table. That way my colleagues and I would be able to see pretty quickly what data sources are used.
I could place this script on each report server, collect the CSVs on a central server and then use SSIS to insert them into an MSSQL table. That way I would have a nice central overview of all the used data sources.
However, I would prefer to have the script in one location and then execute that script on a list of servers.
Something like:
Loop through table with servers
execute script (see link)
insert resulting CSV into central table (preferably skip this step and have the script insert the data into the table directly)
next server
Any suggestions as to what the best approach would be? Should it be a Web Service Task? A Script Task?
Something else completely?
The level of scripting in the mentioned script is right at the edge of what I understand, so if someone knows how to adapt it so that I could use it as input for a data flow in SSIS, I would be very happy.
Thanks for thinking with me,
Henro
This script is called using a utility called rs.exe, so you would use an Execute Process Task to call it. To avoid writing to a file, you could modify the script to insert the results into a table directly. The package could be set up as follows:
Create a Foreach Loop which iterates over a list or an ADO.NET recordset of your servers
Put the server name in a variable
Create a variable for the arguments for the process task, referencing the server variable from step 2 (see the sketch after this list)
Add an Execute Process Task which uses the above arguments variable and calls rs.exe
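As a rough illustration only (the script path and URL pattern are assumptions, not part of the original answer), the arguments variable from step 3 could be an SSIS expression along these lines:

"-i C:\\Scripts\\GetDataSources.rss -s http://" + @[User::ServerName] + "/reportserver"

Here -i is the input .rss file and -s the report server URL, both documented rs.exe switches; the Execute Process Task in step 4 would then have its Arguments property mapped to this variable.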

Common function / stored procedures for all databases

We have a database server and it has about 10 databases.
I would like to create some functions / stored procedures which can be used in all databases.
For example, we can use sp_executesql in any database.
We have some requirements like that (getting the current academic year, financial year, etc.).
Is it doable?
As others have suggested, you could put objects into the master database, but Microsoft explicitly recommends that you should not do that. I find that solution to be rather risky anyway, because the master database is 'owned' by the system, not by you, so there are no guarantees that it will continue to behave in the same way in the future.
Instead, I would consider this to be primarily a deployment issue. There are (at least) two strategies you could use:
Deploy the objects to every database
Deploy them to one 'reference' database that is only used for shared objects and create synonyms in the other databases
The second option is perhaps the better one, because if your functions use tables (e.g. you use a calendar table to get the academic year, which is much easier than calculating it) then you would have to create the same tables in every database too. By using synonyms, you only have to maintain one set of tables.
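A minimal sketch of the synonym approach, assuming a reference database called RefDB and a hypothetical function name:

-- Run once in each user database: point a local synonym at the
-- shared object in the reference database.
CREATE SYNONYM dbo.GetAcademicYear FOR RefDB.dbo.GetAcademicYear;
GO
-- Code in this database now calls it by the local name:
SELECT dbo.GetAcademicYear(GETDATE());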
For the actual deployment, it's straightforward to use scripting to manage the objects, because you just need a list of databases to connect to and run each DDL script against. You can do that using batch files and SQLCMD (perhaps with SQLCMD variables in your .sql scripts), or drive it from PowerShell or any other language that you prefer.
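For instance, a deployment script using a SQLCMD variable might look like this sketch (the object name and academic-year rule are hypothetical; you would run it once per database, passing -v TargetDb=... to sqlcmd):

-- deploy_shared_objects.sql (run in SQLCMD mode)
USE [$(TargetDb)];
GO
IF OBJECT_ID('dbo.GetAcademicYear', 'FN') IS NOT NULL
    DROP FUNCTION dbo.GetAcademicYear;
GO
CREATE FUNCTION dbo.GetAcademicYear (@d DATETIME)
RETURNS INT
AS
BEGIN
    -- Hypothetical rule: the academic year rolls over in September.
    RETURN (CASE WHEN MONTH(@d) >= 9 THEN YEAR(@d) ELSE YEAR(@d) - 1 END);
END
GO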
Depending upon what the SP actually does, you can create the procedure in master, name it with an sp_ prefix and mark it as a system object:
http://weblogs.sqlteam.com/mladenp/archive/2007/01/18/58287.aspx
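The linked post amounts to something like the following sketch (the procedure body is hypothetical; sp_MS_marksystemobject is undocumented, as the next answer notes):

USE master;
GO
CREATE PROCEDURE dbo.sp_GetAcademicYear
AS
BEGIN
    -- Hypothetical body; as a marked system object the procedure
    -- executes in the context of whichever database calls it.
    SELECT CASE WHEN MONTH(GETDATE()) >= 9
                THEN YEAR(GETDATE()) ELSE YEAR(GETDATE()) - 1 END AS AcademicYear;
END
GO
-- Undocumented: lets the sp_ procedure resolve from any database.
EXEC sp_MS_marksystemobject 'sp_GetAcademicYear';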
A couple of options:
You can use a system stored procedure as Cade says. I've done this in the past and it works OK. One warning is that the sp_MS_marksystemobject procedure is undocumented, which may mean that it could vanish or change without warning in future SQL Server versions. Thinking back, I believe there were other problems using this approach with functions, though.
Another approach is to use standardised procedures and functions, and roll them out across your databases using sp_MSforeachdb to run code against every database. If you need to run against only your 10 databases, you can copy the code from this procedure and modify it to check that a database matches your schema before running the code (or you can write your own version that does a similar thing).
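A rough sketch of that filtering (the database names and inner command are hypothetical; sp_MSforeachdb is also undocumented):

-- sp_MSforeachdb replaces ? with each database name in turn.
EXEC sp_MSforeachdb N'
IF ''?'' IN (''AppDb1'', ''AppDb2'')
    EXEC(''USE [?]; SELECT DB_NAME() AS DeployingTo;'');
';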

Creating a New Database from Within a Stored Procedure

Due to an employee quitting, I've been given a project that is outside my area of expertise.
I have a product where each customer will have their own copy of a database. The UI for creating the database (licensing, basic info collection, etc) is being outsourced, so I was hoping to just have a single stored procedure they can call, providing a few parameters, and have the SP create the database. I have a script for creating the database, but I'm not sure the best way to actually execute the script.
From what I've found, this seems to be outside the scope of what an SP can easily do. Is there any sort of "best practice" for handling this sort of program flow?
Generally speaking, SQL scripts - both DML and DDL - are what you use for database creation and population. SQL Server has a command line interface called SQLCMD that these scripts can be run through - here's a link to the MSDN tutorial.
Assuming there's no customization to the tables or columns involved, you could get away with using either detach/attach or backup/restore. Both would require that a baseline database exist, with no customer data. You then use either of those methods to capture the database as-is. Backup/restore is preferable because detach/attach requires taking the database offline. In either case, users need to be synced before they can access the database.
If you have the script to create the database, it is easy for them to run it from within their program. If you have specific prerequisites for creating the database and setting permissions accordingly, you can wrap all the scripts up into one script file to execute.
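If the stored-procedure route is still preferred, here is a minimal sketch using dynamic SQL (the procedure name and steps are hypothetical; note that CREATE DATABASE cannot run inside a user transaction):

CREATE PROCEDURE dbo.CreateCustomerDatabase
    @DbName SYSNAME
AS
BEGIN
    DECLARE @Sql NVARCHAR(MAX);
    -- QUOTENAME guards against injection through the database name.
    SET @Sql = N'CREATE DATABASE ' + QUOTENAME(@DbName) + N';';
    EXEC (@Sql);
    -- Further dynamic SQL could create the schema here, or restore
    -- a baseline backup into the new database instead.
END
GO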

Stored Procedures MSSQL2005

If you have a lot of Stored Procedures and you change the name of a column of a table, is there a way to check which Stored Procedures won't work any longer?
Update: I've read some of the answers and it's clear to me that there is no easy way to do this. Would it be easier to move away from stored procedures?
I'm a big fan of SysComments for this:
SELECT DISTINCT Object_Name(ID)
FROM SysComments
WHERE text LIKE '%Table%'
AND text LIKE '%Column%'
There's a book-style answer to this, and a real-world answer.
First, for the book answer, you can use sp_depends to see what other stored procs reference the table (not the individual column) and then examine those to see if they reference the column:
http://msdn.microsoft.com/en-us/library/ms189487.aspx
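For a given table, that is simply (table name hypothetical):

EXEC sp_depends @objname = N'dbo.MyTable';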
The real-world answer, though, is that it doesn't work in a lot of cases:
Dynamic SQL strings: if you're building strings dynamically, either in a stored proc or in your application code, and then executing that string, SQL Server has no way of knowing what your code is doing. You may have the column name hard-coded in your code, and that'll break.
Embedded T-SQL code: if you've got code in your application (not in SQL Server) then nothing in the SQL Server side will detect it.
Another option is to use SQL Server Profiler to capture a trace of all activity on the server, then search through the captured queries for the field name you want. It's not a good idea on a production server, because the trace incurs some overhead, but it does work - most of the time. Where it will break is if your application does a "SELECT *" and then expects a specific field name to come back as part of that result set.
You're probably beginning to get the picture that there's no simple, straightforward way to do this.
While this will take the most work, the best way to ensure that everything works is to write integration tests.
Integration tests are just like unit tests, except in this case they integrate with the database. It would take some effort, but you could write tests that exercise each stored procedure to ensure it executes without error.
In the simplest case, a test would just execute the SP and make sure there is no error, without being concerned about the actual results. If your tests just executed SPs without checking results, you could write a lot of this generically.
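A crude, generic version of that in plain T-SQL might look like the sketch below; it assumes the procedures take no required parameters and only prints failures rather than reporting to a test runner:

-- Smoke test: execute every user stored procedure and report errors.
DECLARE @Proc SYSNAME, @Msg NVARCHAR(2048);
DECLARE c CURSOR FOR
    SELECT QUOTENAME(SCHEMA_NAME(schema_id)) + '.' + QUOTENAME(name)
    FROM sys.procedures;
OPEN c;
FETCH NEXT FROM c INTO @Proc;
WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        EXEC (@Proc);  -- will fail for procs with required parameters
    END TRY
    BEGIN CATCH
        SET @Msg = @Proc + N' failed: ' + ERROR_MESSAGE();
        PRINT @Msg;
    END CATCH;
    FETCH NEXT FROM c INTO @Proc;
END
CLOSE c;
DEALLOCATE c;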
To do this you would need a database to execute against. While you could set up the database and deploy your stored procs manually, the best way would be to use continuous integration to automatically get the latest code (database DDL, stored procs, tests) from your source control system, build your database, and execute your tests. This would happen every time you commit changes to source control.
Yes, it's a lot of work, but the payoff is also big: the ability to ensure that your changes don't break anything lets you move your product forward faster and with better quality.
Take a look at NUnit and NDbUnit
I'm sure there are more elegant ways to address this, but if the database isn't too complex, here's a quick and dirty way:
Select all the sprocs and script to a query window.
Search for the old column name.
If you are only interested in finding the column usage in the stored procedures, probably the best way is to do a brute-force search for the column name in the definition column of the sys.sql_modules table, which stores the definitions of stored procedures and functions.
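Something along these lines (the column name is a placeholder):

-- Find modules whose definition mentions the column name.
SELECT OBJECT_NAME(object_id) AS ModuleName
FROM sys.sql_modules
WHERE definition LIKE '%MyColumn%';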
