Database cleanup - SQL Server

I inherited a SQL Server database that is not well designed. (A consulting company came in to do the project and left without completing it.)
The main issues I have with this database are:
Data types: a lot of tinyint and text columns.
Tables are not normalized: some of the keys are names instead of sequential IDs.
A lot of tables that I am not sure are being used.
A lot of stored procedures that I am not sure are being used.
Badly named tables and stored procedures.
I also inherited the ASP.NET application that runs against this database.
I would like to clean this database up. I understand that changing the data types will have to happen table by table. As for getting rid of all the extra tables and stored procedures, what is the easiest way to do so?
Any other tips to make it cleaner and smaller are appreciated.
I also want to mention that I have the Red Gate tools installed, if that helps.
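(For the data types, I assume it's a per-column ALTER on each table, something like the following; the table and column names below are made up.)

-- made-up names, just to illustrate the per-column changes I expect to write
alter table dbo.Orders alter column Notes varchar(max) null;   -- text -> varchar(max)
alter table dbo.Orders alter column Quantity int not null;     -- tinyint -> int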
Thank you

Check out SQL Server Data Tools; it lets you create a project from a live database. Among the things you can do in there is right-click 'Find Usages' for tables, views and functions.
So long as the previous developer used stored procedures and views rather than querying the tables directly, it should find the references that way, without breaking your project.
Also, for finding stored procedures that are not used, put some basic logging at the top of each stored procedure in your application. After X number of days, the ones that have never appeared in your log table are likely safe to remove; otherwise, a tedious search through your .NET code will find them.
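A minimal sketch of that logging approach, assuming a simple log table (all object names below are hypothetical):

-- One-off: a table to record which procedures actually get called.
CREATE TABLE dbo.ProcUsageLog (
    ProcName sysname      NOT NULL,
    CalledAt datetime2(0) NOT NULL DEFAULT SYSDATETIME()
);
GO

-- At the top of each existing stored procedure, add one insert.
ALTER PROCEDURE dbo.usp_SomeExistingProc
AS
BEGIN
    INSERT INTO dbo.ProcUsageLog (ProcName) VALUES (OBJECT_NAME(@@PROCID));
    -- ...original procedure body continues unchanged...
END
GO

-- After X days, anything that never shows up in the log is a removal candidate.
SELECT p.name
FROM sys.procedures AS p
LEFT JOIN dbo.ProcUsageLog AS l ON l.ProcName = p.name
WHERE l.ProcName IS NULL;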

Related

Tool To Generate Data Migration Script From Table A To Table B In SQL Server

During our SQL Server database deployments, we create a temporary table which contains the new desired state of data for a particular table. We then merge the temp table into the target table (we actually use individual insert, update and delete statements, but that's probably not relevant). The inserts/updates/deletes performed are captured and written out to a log.
We would like to be able to report on what changes would be applied by a deployment, without actually applying them. This is currently done by rolling back the transaction at the end of the above process. This doesn't feel particularly great though.
Now what we are thinking of doing is, instead of performing the changes and rolling them back, we will generate a migration script for the table (generate some SQL code that performs the necessary inserts, updates and deletes). If we want to do the actual deployment, this code will be dynamically executed. If not, the code will just be printed to a log.
It shouldn't take long to put together some code which can generate migration scripts for two specified tables, but I first wanted to verify that there isn't already an existing tool which can do this?
Searching on Google, I can find lots of talk about migrating whole databases, but nothing about generating a data migration script to effectively merge one table into another.
So my question is, does anyone know of such a tool?
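To make that concrete, the capture step we perform today could be expressed as a single MERGE with an OUTPUT clause (rather than the individual statements we actually use); all names below are made up:

-- Hypothetical tables: #DesiredState holds the new data, dbo.TargetTable is the live table.
MERGE dbo.TargetTable AS t
USING #DesiredState AS s
    ON t.Id = s.Id
WHEN MATCHED AND t.SomeValue <> s.SomeValue THEN
    UPDATE SET t.SomeValue = s.SomeValue
WHEN NOT MATCHED BY TARGET THEN
    INSERT (Id, SomeValue) VALUES (s.Id, s.SomeValue)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE
-- Capture what happened to each row for the deployment log.
OUTPUT $action AS ChangeType, inserted.Id AS NewId, deleted.Id AS OldId
INTO dbo.DeploymentChangeLog (ChangeType, NewId, OldId);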
There are several data compare tools like:
SQL Data Compare from Red Gate
SQL Server Data Tools
dbForge Data Compare from Devart
Is that what you're looking for?

Is it possible to emit sql from grails/groovy newInstance/saves?

My team is looking into db migration tools (e.g., Flyway, Liquibase) and so I'm thinking about how to incorporate changes I make to the db contents using my groovy+grails service method. I'm not referring to changes to columns and/or tables (i.e., domain classes), I'm referring to inserts/updates of rows which represent configuration values for the associated webapp.
My service method is written to be used somewhat interactively. That is, when I'm adding or updating rows in various tables (i.e., newInstance or save), it helps me navigate various db constraints and to make sure all the foreign keys and my own business logic are set correctly. I run it repeatedly (rolling back each time afterwards using setRollbackOnly()) until I've found something I'm happy with. The method is written in groovy, and I don't want to rewrite it in sql.
Is there a way to get groovy/grails to emit the sql it would execute instead of executing the sql? That is, give me something I could copy/paste into a Flyway migration or Liquibase changeset?
I looked into logging, but I'd have to somehow process that output to substitute the values in and get the proper column names, and even then I'd need a way to distinguish the lines that actually change the db (maybe I could just extract the inserts and updates). I also looked into these Grails database migration scripts, but they appear to either look at domain classes (which isn't where my changes are happening) or at the entire database (which would sweep up a lot of user data too).
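Essentially, what I'm after is plain SQL I could paste into a migration, something along these lines (the table, columns and values here are made up):

-- e.g. the body of a Flyway migration such as V2__add_config_rows.sql
INSERT INTO app_setting (id, name, value, version)
VALUES (42, 'feature.enabled', 'true', 0);

UPDATE app_setting
SET value = 'EUR', version = version + 1
WHERE name = 'default.currency';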
Thanks!

Common function / stored procedures for all databases

We have a database server and it has about 10 databases.
I would like to create some functions / stored procedures which can be used in all databases.
For example, we can use sp_executesql in any database.
We have some requirements like that (getting the current academic year, financial year, etc.).
Is it doable?
As others have suggested, you could put objects into the master database, but Microsoft explicitly recommends that you should not do that. I find that solution to be rather risky anyway, because the master database is 'owned' by the system, not by you, so there are no guarantees that it will continue to behave in the same way in the future.
Instead, I would consider this to be primarily a deployment issue. There are (at least) two strategies you could use:
Deploy the objects to every database
Deploy them to one 'reference' database that is only used for shared objects and create synonyms in the other databases
The second option is perhaps the better one, because if your functions use tables (e.g. you use a calendar table to get the academic year, which is much easier than calculating it) then you would have to create the same tables in every database too. By using synonyms, you only have to maintain one set of tables.
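A rough sketch of option 2, with hypothetical names (SharedObjects is the reference database, and the 1 August year start is just an example):

-- In the shared reference database (hypothetical name: SharedObjects)
CREATE FUNCTION dbo.fn_CurrentAcademicYear ()
RETURNS int
AS
BEGIN
    -- Assumes the academic year starts on 1 August; swap in a calendar-table lookup if you have one.
    RETURN CASE WHEN MONTH(GETDATE()) >= 8
                THEN YEAR(GETDATE())
                ELSE YEAR(GETDATE()) - 1
           END;
END;
GO

-- In each application database: a synonym, so local code just calls dbo.fn_CurrentAcademicYear()
CREATE SYNONYM dbo.fn_CurrentAcademicYear
FOR SharedObjects.dbo.fn_CurrentAcademicYear;
GO

SELECT dbo.fn_CurrentAcademicYear() AS AcademicYear;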
For the actual deployment, it's straightforward to use scripting to manage the objects, because you just need a list of databases to connect to and run each DDL script against. You can do that using batch files and SQLCMD (perhaps with SQLCMD variables in your .sql scripts), or drive it from PowerShell or any other language that you prefer.
Depending on what the SP actually does, you may want to create the procedure in master, name it with an sp_ prefix and mark it as a system procedure:
http://weblogs.sqlteam.com/mladenp/archive/2007/01/18/58287.aspx
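A rough sketch of that approach; the procedure name is hypothetical, and see the caveat about sp_MS_marksystemobject in the answer below:

USE master;
GO
-- Hypothetical shared procedure; the sp_ prefix lets it be called from any database.
CREATE PROCEDURE dbo.sp_GetAcademicYear
AS
BEGIN
    SELECT CASE WHEN MONTH(GETDATE()) >= 8
                THEN YEAR(GETDATE())
                ELSE YEAR(GETDATE()) - 1
           END AS AcademicYear;
END;
GO
-- Undocumented: marks it as a system object so any object references inside it
-- resolve in the calling database rather than in master.
EXEC sp_MS_marksystemobject 'sp_GetAcademicYear';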
A couple of options:
You can use a system stored procedure as Cade says. I've done this in the past and it works OK. One warning: the sp_MS_marksystemobject procedure is undocumented, which means it could vanish or change without warning in future SQL Server versions. Thinking back, I believe there were other problems using this approach with functions, though.
Another approach is to use standardized procedures and functions, and roll them out across your databases using sp_MSforeachdb to run code against every database. If you need to run against only your 10 databases, you can copy the code from that procedure and modify it to check that a database matches your schema before running the code (or you can write your own version that does a similar thing).
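A rough sketch of that second option, filtering sp_MSforeachdb down to your own databases. The database names and the function body are hypothetical (a UK-style 1 April financial year start is just an example), and note that sp_MSforeachdb is itself undocumented:

-- sp_MSforeachdb replaces ? with each database name in turn.
-- The nested EXEC is needed because CREATE FUNCTION must be the first statement in its batch,
-- and CREATE OR ALTER needs SQL Server 2016 SP1 or later.
EXEC sp_MSforeachdb N'
IF ''?'' IN (''AppDb01'', ''AppDb02'', ''AppDb03'')
BEGIN
    PRINT ''Deploying shared function to ?'';
    EXEC (''USE [?];
           EXEC(''''CREATE OR ALTER FUNCTION dbo.fn_CurrentFinancialYear() RETURNS int
                    AS BEGIN
                        RETURN CASE WHEN MONTH(GETDATE()) >= 4
                                    THEN YEAR(GETDATE())
                                    ELSE YEAR(GETDATE()) - 1 END;
                    END'''')'');
END';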

How to separate user-defined messages for different databases

I have 2 databases, each with its own user-defined messages. The problem is that their message_ids stored in sys.messages intersect, so these DBs cannot be deployed on the same SQL Server instance without changing all the messages in one of them (but that is too expensive - I would have to change ALL the stored procedures).
Is there any way to make error messages specific to database?
The sys.messages catalog is instance-wide, not per-database, so you simply cannot do that easily. It really should only be used for things that are themselves server-wide (like extended stored procedures and other server extensions), not for regular stored procedures. But I guess it's already too late to change that.
I see only two ways around it:
Use the "language" field when adding the message to identify not the message but the application. Of course, that will cause additional problems as you'll need to have each application use it's own "language".
Use two different instances of SQL server, one for each app.
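For the first option, something along these lines; the message number, texts and the choice of French are all hypothetical:

-- The us_english version of a message number must be added before any other language,
-- and the severity of the other language must match it.
EXEC sp_addmessage @msgnum = 50001, @severity = 16,
     @msgtext = N'Application A: invalid order state.', @lang = 'us_english';
EXEC sp_addmessage @msgnum = 50001, @severity = 16,
     @msgtext = N'Application B: missing customer record.', @lang = 'French';
GO
-- Application B sets its session language so RAISERROR resolves message 50001 to "its" text.
SET LANGUAGE French;
RAISERROR (50001, 16, 1);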

Linking tables between databases

I'm after a bit of advice on the best way to go about this in SQL Server 2008 R2 Express. I have a number of applications that are in separate databases on the same server. They are all "plugins" that use a central staff/structure list that will be in a separate database. The application is in the process of being migrated from JET.
What I'm looking for is the best way for all the "plugin" databases to see the central database and use its tables in standard queries, views, etc.
As I'm using Express, that rules out any replication solution, and so far the only option I can think of is to use triggers or a stored procedure to "push" all the changes out to the plugins. The information needs to be populated on a near real-time basis; however, the number of changes will be very small, maybe up to 100 a day, and the biggest table (the staff names table) only has about 1,000 rows at the moment.
Hopefully that covers everything, but if anyone needs any more details then just ask.
Thanks
Apologies if I've misunderstood, but from your description it sounds like all these databases are hosted on the same instance of SQL Server - it's your mention of replication that makes me uncertain.
Assuming that's the case, you should be able to replace any copies of tables from the central database which are held in the "plugin" databases with views or synonyms which reference the central tables directly, since SQL Server allows you to make references between databases on the same server using three-part naming (database_name.schema_name.object_name).
For example, if each plugin db has a table StaffNames, you could replace this with a view by dropping the table, then creating a view:
drop table StaffNames
go
create view StaffNames
as
select * from <centraldbname>.<schema - probably dbo>.StaffNames
go
and your code should continue to work seamlessly, as long as permissions are set up.
Alternatively, you could replace all the references to the shared tables in the plugin databases with three-part name references to the central database, but the view method requires less work.
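If you'd rather not maintain view definitions at all, the synonym route mentioned above is a one-liner per table (same placeholders as before):

drop table StaffNames
go
create synonym StaffNames for <centraldbname>.<schema - probably dbo>.StaffNames
go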
