Best practice for deploying stored procedures to multiple databases with the same structure?

We have one website currently being used by 3 clients. We have 3 copies of the same source code, each calling a different database. So:
1) Client A accessing "http://custA.weblink.com" will use the "CustA" database
2) Client B accessing "http://custB.weblink.com" will use the "CustB" database
3) Client C accessing "http://custC.weblink.com" will use the "CustC" database
All databases have the same structure, table design and stored procedures. Only the data is different.
The issue is that when I need to deploy stored procedures, I have to repeat the backup and deployment 3 times. It's not really hard, even when there are lots of stored procedures to deploy, but it doesn't seem like good practice.
Right now I only have 3 clients, but what if in the future I have 10? I would need to repeat the backup and deployment 10 times, which is time consuming, and it's hard to guarantee that all stored procedures in all databases will always stay the same.
In a case like this, where I have multiple existing applications and databases, what would be good practices or measures to improve the situation? I don't think my company will allow huge changes like merging all clients' data into one database and rewriting the application flow to fetch the right data.
I thought about creating one main database without any data. All the stored procedure scripts would be deployed there, and in each of the existing "CustA", "CustB" and "CustC" databases I would use EXEC to call the stored procedure from the main database to process the data in the relevant DB. Like this:
1) Main database
USE [MainDatabase]
GO
ALTER PROCEDURE USP_GetCustomerById
    @CustId BIGINT
AS
BEGIN
    SELECT * FROM [Customer] WHERE Id = @CustId
END
2) CustA database (same flow for the CustB and CustC databases)
USE [CustA]
GO
ALTER PROCEDURE USP_GetCustomerById
    @CustId BIGINT
AS
BEGIN
    EXEC MainDatabase.dbo.USP_GetCustomerById @CustId
END
Will there be any impact if I do so?
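For scale, the repetition described above can at least be driven from a single script. A rough sketch, assuming all the client databases live on the same instance and the procedure body is identical in each (the database names and the procedure come from the question; everything else is illustrative):
DECLARE @db sysname;
DECLARE @proc nvarchar(400);
DECLARE @sql nvarchar(max) = N'
ALTER PROCEDURE dbo.USP_GetCustomerById
    @CustId BIGINT
AS
BEGIN
    SELECT * FROM dbo.[Customer] WHERE Id = @CustId;
END';

DECLARE db_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM sys.databases WHERE name IN (N'CustA', N'CustB', N'CustC');
OPEN db_cursor;
FETCH NEXT FROM db_cursor INTO @db;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- run the ALTER in the context of each client database
    SET @proc = QUOTENAME(@db) + N'.sys.sp_executesql';
    EXEC @proc @sql;
    FETCH NEXT FROM db_cursor INTO @db;
END;
CLOSE db_cursor;
DEALLOCATE db_cursor;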

Have you ever considered using Visual Studio to create a SQL Server Database Project? There you can import, for instance, the ClientA database from the first server. It will import the schemas, objects, views, indexes and so on, and then you can set up different deployment servers. You can also compare source and destination (A and B) to each other, to see if you have differences.
Example of deploying one database to multiple servers
As you can see in my picture, I have the whole structure for one database. At the bottom of the Solution Explorer you can see I have something called PROD and TEST; these are actually two different servers. You could create the 3 servers you need and then just press deploy.
Example of Schema Compare
Here I have compared a source with my project. Then I can import the changes I made on ClientA into my project, so I can deploy them to the other servers.
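If you just want a quick manual check for drift outside the tooling, procedure definitions can also be compared across two databases on the same instance with the catalog views. A rough sketch, using the CustA and CustB databases from the question above as an example:
-- list procedures that are missing or whose definitions differ between two client databases
SELECT COALESCE(a.name, b.name) AS ProcName,
       CASE WHEN a.name IS NULL THEN 'missing in CustA'
            WHEN b.name IS NULL THEN 'missing in CustB'
            WHEN a.definition <> b.definition THEN 'different'
            ELSE 'same'
       END AS Status
FROM (SELECT p.name, m.definition
      FROM CustA.sys.procedures AS p
      JOIN CustA.sys.sql_modules AS m ON m.object_id = p.object_id) AS a
FULL OUTER JOIN
     (SELECT p.name, m.definition
      FROM CustB.sys.procedures AS p
      JOIN CustB.sys.sql_modules AS m ON m.object_id = p.object_id) AS b
  ON a.name = b.name;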

Related

Copy records from a table on one SQL instance to an identical table on a different SQL instance

We had an intern who was given written instructions for deleting old data from a database based on dates (from within our ERP system). They were fascinated by the results and just kept deleting instead of stopping at the required date. There are now 4 years of missing records in the production database. I have these records in my development database, which is in a different instance on a different server. Is there a way to transfer just those 4 years worth of data from my development database to my production database, checking, of course, to make sure there are no duplicates (unique index on transaction number).
I haven't tried anything yet because I'm not sure where to start. I do have a test database on the same instance as the production database that I could use to test the transfer with.
There are several ways to do this. Assuming that this is on a different machine, you will want to create a Linked Server on your dev machine to link to the target server (Or, technically, a link from the production server to your dev machine could be used as well). Then, perform an insert of the selected records from the source to the target.
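A rough sketch of that linked-server route, run from the dev instance (the server, database, table and column names below are placeholders, not taken from the question):
-- one-time: register the production server as a linked server on the dev instance
EXEC sp_addlinkedserver @server = N'PRODSERVER', @srvproduct = N'SQL Server';

-- copy the missing date range, skipping rows whose transaction number already exists in production
INSERT INTO PRODSERVER.ProdDB.dbo.YourTable ([Columns])
SELECT s.[Columns]
FROM DevDB.dbo.YourTable AS s
WHERE s.TransactionDate >= '<range start>' AND s.TransactionDate < '<range end>'   -- the missing four-year window
  AND NOT EXISTS (SELECT 1
                  FROM PRODSERVER.ProdDB.dbo.YourTable AS t
                  WHERE t.TransactionNumber = s.TransactionNumber);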
More efficiently, you can use the Export Data functionality. Right click on the database (Not the server / instance, but the database) and select Tasks / Export Data from the popup menu. This will pop up the SQL Server Import and Export Wizard. Use your query above to select the data for export.
If security considerations interfere with this, create a duplicate of the table(s) with alternate names (e.g. MyInvRecords) in a new database, and export the data into those tables. Back up that DB, transfer it to someplace accessible from the target server, restore that DB, then transfer the rows back into the original DB.
I haven't had to use anything but these methods before, so one of them should work for you.
A basic insert will work just fine.
INSERT INTO ProdDB.[schema].YourTable ([Columns])
SELECT [Columns]
FROM TestDB.[schema].YourTable
WHERE -- your date-range predicates here

Same app for multiple clients and custom stored procedures in SQL Server

We have the same app source (with some custom "if x client then") and basically the same SQL Server database structure. But, some clients need slightly different stored procedures.
What would be the best practice in this scenario for maintaining the databases long term and keeping the structure correct? As of now, for example, when I change a procedure in one database and need to do the same in 9 of the 10 others, I just ALTER the procedure and USE a different database. But I can't keep track of which procedures are different in that special snowflake client.
Any ideas? The plan is of course to get more clients so that's looking for trouble.
I try to push the "one fits all" concept but hey, what can you do...
Maybe have that "if x client then" as a CASE statement in the SQL Server stored procedure, so you can just ALTER it mindlessly everywhere?
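A rough sketch of what that could look like, with a single procedure body that is identical in every client database (all table, column and parameter names here are made up for illustration):
ALTER PROCEDURE dbo.USP_GetOrderTotals
    @ClientCode varchar(10)
AS
BEGIN
    SELECT o.OrderId,
           CASE WHEN @ClientCode = 'X'
                THEN o.Total - o.SpecialDiscount   -- the "if x client then" variation lives inside the proc
                ELSE o.Total
           END AS Total
    FROM dbo.Orders AS o;
END
Because the body is the same everywhere, the same ALTER script can be applied to every database, and the per-client behaviour is driven by data or parameters rather than a per-database fork.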

Database cleanup

I inherited a SQL Server database that is not well designed. (Some consulting company came in to do the project and left without completing it.)
The main issues I have with this database are:
Data types: a lot of tinyint and text types.
Tables are not normalized: some of the keys are names instead of sequential IDs.
A lot of tables that I am not sure are being used.
A lot of stored procedures that I am not sure are being used.
Badly named tables and stored procs.
I also inherited the asp.net application that runs against this database.
I would like to clean this database up. I understand that changing the data types will have to happen table by table. For getting rid of all the extra tables and stored procs, what is the easiest way to do so?
Any other tips to make it cleaner and smaller are appreciated.
I also want to mention that I have the Redgate tools installed (if that helps).
Thank you
Check out SQL Server Data Tools; they allow you to create a project from a live database. One of the things you can do in there is right-click 'Find Usages' for the tables, views and functions.
So long as the previous developer used stored procedures and views rather than querying directly, it should find the references in your project that way, without breaking your project.
Also, for finding stored procedures that are not used, put some basic logging at the top of each stored procedure in your application; after X amount of days, those that haven't been logged in your table are likely safe to remove. Otherwise, a tedious search through your .NET code will find them.
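A rough sketch of that logging idea (the log table and the example procedure name are made up; the real procedures keep their existing bodies):
CREATE TABLE dbo.ProcUsageLog (
    ProcName sysname      NOT NULL,
    CalledAt datetime2(0) NOT NULL DEFAULT SYSUTCDATETIME()
);
GO
ALTER PROCEDURE dbo.SomeExistingProc
AS
BEGIN
    -- added line: record every call to this procedure
    INSERT INTO dbo.ProcUsageLog (ProcName) VALUES (OBJECT_NAME(@@PROCID));

    -- ... original body of the procedure unchanged ...
END
After the observation window, anything that never shows up in dbo.ProcUsageLog is a candidate for removal.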

How to deploy/replicate/copy a SQL Server 2005 database including CLR objects to multiple servers

We maintain a generic tools database at work that contains useful stored procedures, functions, and the odd lookup table. It also contains some CLR objects which need to be included.
We have around 10 servers and want to have a copy of this database available on each instance, including the CLR functions.
I plan to nominate one DB as the master and then replicate/logship the db to other instances on a nightly basis. I would also like to be able to kick off the replication by hand if necessary.
Any advice on how this can be achieved?
Thanks : )
I kept this thread open in hopes that someone would come up with something clever. Here's what I came up with:
1) You could implement the log shipping you're talking about and create a read-only snapshot of the log shipping target database on the non-master servers. You'd drop and re-create the snapshot every day.
2) Back up to a location that's available to all the servers and just do a straight restore every day.
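A rough sketch of option 2 (the database name, share path and scheduling are placeholders); because CLR assemblies are stored inside the database, they come across with the restore:
-- nightly, on the master server (e.g. from a SQL Agent job):
BACKUP DATABASE ToolsDB
    TO DISK = N'\\fileshare\backups\ToolsDB.bak'
    WITH INIT;

-- nightly, on each of the other instances:
RESTORE DATABASE ToolsDB
    FROM DISK = N'\\fileshare\backups\ToolsDB.bak'
    WITH REPLACE;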

Linking tables between databases

I'm after a bit of advice on the best way to go about this in SQL Server 2008 R2 Express. I have a number of applications in separate databases on the same server. They are all "plugins" that use a central staff/structure list that will be in a separate database. The application is in the process of being migrated from JET.
What I'm looking for is the best way for all the "plugin" databases to see the central database and use its tables in standard queries, views, etc.
As I'm using Express, that rules out any replication solution, and so far the only option I can think of is to use triggers or a stored procedure to "push" all the changes out to the plugins. The information needs to be populated on a near real-time basis; however, the number of changes will be very small, maybe up to 100 a day, and the biggest table only has about 1,000 rows at the moment (the staff names table).
Hopefully that covers everything, but if anyone needs more details then just ask.
Thanks
Apologies if I've misunderstood, but from your description it sounds like all these databases are hosted on the same instance of SQL Server - it's your mention of replication that makes me uncertain.
Assuming that's the case, you should be able to replace any copies of tables from the central database which are held in the "plugin" databases with views or synonyms which reference the central tables directly, since SQL Server allows you to make references between databases on the same server using three-part naming (database_name.schema_name.object_name).
For example, if each plugin db has a table StaffNames, you could replace this with a view by dropping the table, then creating a view:
drop table StaffNames
go
create view StaffNames
as
select * from <centraldbname>.<schema - probably dbo>.StaffNames
go
and your code should continue to work seamlessly, as long as permissions are set up.
Alternatively, you could replace all the references to the shared tables in the plugin databases with three-part name references to the central database, but the view method requires less work.
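For completeness, the synonym variant mentioned above looks almost the same as the view; a rough sketch, with <centraldbname> standing in for the central database as in the example above:
drop table StaffNames
go
-- a synonym is a named alias for the central table, so existing queries keep working
create synonym StaffNames for <centraldbname>.dbo.StaffNames
go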
