Run the same replication script on several databases

We have several SQL Server 2000 databases (I know, we need to upgrade) that have the same structure and are set up to replicate to another server. The problem is that whenever I have to change the structure (which is usually pretty easy to do on all databases, especially with tools from Red Gate), I have to stop the replication, make the structure changes, and then set up replication again. The steps to set up replication only take a few minutes per database, but it's repetitive and drives me crazy. I have the IDE create a script of the replication procedure and then just replace the name of the prior database with the name of the next database and run the script. Still annoying, but faster than clicking through the IDE and forgetting an option.
I've tried things like using sp_MSforeachdb, but that didn't look very promising.
My guess is I should just take the T-SQL that gets generated by the IDE, use it as a starting point for a new T-SQL script, and have it accept the name of the database as a parameter. Then, when something changes in the structure of the database, I address it in the T-SQL replication script and make the changes there. Is this an issue for anyone else? Does 2005 or 2008 have a better sp_MSforeachdb, so that I wouldn't have to maintain some crazy script and could just have the IDE generate a script when there are changes, which I could then use on multiple databases easily?
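For what it's worth, sqlcmd scripting variables make the parameterized-script idea fairly painless, and sqlcmd can connect to SQL Server 2000. A minimal sketch, assuming a hypothetical publication naming convention; the body would really be the IDE-generated script with the hard-coded database name replaced by $(DBName):

    -- setup_replication.sql; run as:
    --   sqlcmd -S MyServer -i setup_replication.sql -v DBName="Sales1"
    EXEC sp_replicationdboption
        @dbname  = N'$(DBName)',
        @optname = N'publish',
        @value   = N'true';
    GO
    EXEC sp_addpublication
        @publication = N'$(DBName)_pub',  -- hypothetical naming convention
        @sync_method = N'native',
        @repl_freq   = N'continuous',
        @status      = N'active';
    GO
    -- ...followed by the generated sp_addarticle / sp_addsubscription calls,
    -- again with $(DBName) substituted for the database name.

When the structure changes, you regenerate the script once in the IDE, swap the database name for $(DBName), and rerun it per database from a one-line batch file.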

Are you using SQL Server replication? If so, why aren't you making the changes to the publishing database and letting it push out the schema changes to its subscribers? We do this occasionally on SQL 2005 and it works well for the most part; I don't have any experience with replication on 2000 servers.
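For reference, on 2005 the publication has to be created with (or changed to) DDL replication enabled for this to work; a hedged sketch, with a placeholder publication name:

    -- SQL Server 2005+: let schema changes at the publisher flow to subscribers.
    EXEC sp_changepublication
        @publication = N'MyPublication',   -- placeholder
        @property    = N'replicate_ddl',
        @value       = 1;
    -- After this, supported DDL run at the publisher, e.g.
    --   ALTER TABLE dbo.Orders ADD ShippedDate datetime NULL;
    -- is replicated without tearing down and recreating the subscription.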

If you can use SQL Server Management Studio, the SSMS Tools Pack has a widget that allows the same script to be run on different databases.

The SQLCMD tool can connect to SQL Server 2000 and enables interaction from the command line. Parameterized scripts plus a fixed set of .bat files (one for each server) can be a good alternative to what you do now.
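A sketch of what that might look like; the script is written once and the per-server .bat file just repeats the sqlcmd line for each database (server, database, and column names below are placeholders):

    -- change.sql: the structure change, written once.
    -- deploy.bat on each server then runs it per database:
    --   sqlcmd -S Server1 -d Sales1 -i change.sql
    --   sqlcmd -S Server1 -d Sales2 -i change.sql
    ALTER TABLE dbo.Customers ADD LoyaltyCode varchar(10) NULL;
    GO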

Related

How to solve System.OutOfMemoryException in SQL Server 2014

My live database, named vlproduction, is on Azure, and I am using SQL Server 2014 on my local machine with a database named testvlproduction. For some reason, my testvlproduction database was deleted.
I want to regenerate testvlproduction to be the same as vlproduction, but I found there is no way to take a direct backup of the live database, so I generated a script with data. The script is too big (300 MB), and whenever I try to run it on my local machine it shows
System.OutOfMemoryException
Please tell me what to do to fix this, or is there another way to generate a local copy of the live database?
Is there already any such functionality built into SQL Server?
This question may be a repeat, but I still have no solution for my issue.
Feel free to ask for any details.
Thanks
This is a limitation of Management Studio, not the SQL Server service: SSMS itself is running out of memory.
It is likely caused by the size of the result set that you are returning to Management Studio.
See https://support.microsoft.com/en-in/kb/2874903 for more details.
For more, see: sql-server-management-studio-cant-handle-large-files
Since your SSMS runs out of memory, you should try a newer release of it; it should be compatible with SQL Server 2014.
If you have some idle time on your production database, take a backup and restore it locally: use Export Data-tier Application and select the proper version where you want to restore. If this is not an option for you, I'd suggest taking @mohan111's suggestion of committing your script in batches.
I believe you have two problems:
System.OutOfMemoryException - this usually happens while running big scripts in SSMS. One workaround is to run the script from the command prompt using the sqlcmd utility (see https://msdn.microsoft.com/en-us/library/ms180944.aspx and the sketch after these two points).
Creating a test database with the same schema as production - you can make use of the SSDT tools in Visual Studio. They let you create a database project that mimics the production database, and you can use its publish functionality to create the database wherever required with the same schema.
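A minimal sketch of the sqlcmd workaround from the first point (instance, database, and file names are placeholders):

    -- From a command prompt, not SSMS:
    --   sqlcmd -S .\SQL2014 -d testvlproduction -E -i vlproduction_script.sql -o run.log
    -- -E uses Windows authentication, -i reads the script from disk, and -o
    -- writes messages to a log, so the 300 MB file is never loaded into
    -- SSMS's address space.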

Recreate database from RedGate checked-in scripts

We've got a SQL Server instance with some 15-20 databases, which we check into TFS with the help of RedGate. I'm working on a script to be able to replicate the instance (so a developer could run a local instance when needed, for example) with the help of these scripts. What I'm worried about is the dependencies between these scripts.
In TFS, RedGate has created these folders with .sql files for each database:
Functions
Security
Stored Procedures
Tables
Triggers
Types
Views
I did a quick test with PowerShell, just looping over these folders to execute the .sql files, but I think that might not always work. Is there a strict ordering which I can follow? Or is there some simpler way to do this? To clarify, I want to be able to start with a completely empty SQL Server instance and end up with a fully configured one according to what is in TFS (without data, but that is OK). Using PowerShell is not a requirement, so if it is simpler to do it some other way, that is preferable.
If you're already using RedGate, they have a ton of articles on how to move changes from source control to the database. Here's one which describes moving database code from TFS using the sqlcompare command line:
http://www.codeproject.com/Articles/168595/Continuous-Integration-for-Database-Development
If you compare to any empty database it will create the script you are looking for.
The only reliable way to deploy the database from scripts folders would be to use Red Gate SQL Compare. If you run the .sql files using PowerShell, the objects may not be created in the right order. Even if you run them in an order that makes sense (functions, then tables, then views...), you still may have dependency issues.
SQL Compare reads all of the scripts and uses them to construct a "virtual" database in memory, then it calculates a dependency matrix for it so when the deployment script is created, things are done in the correct order. That will prevent SQL Server from throwing dependency-related errors.
If you are using Visual Studio with the database option, it includes a Schema Compare that will let you compare what is in the database project in TFS to the local instance. It will create a script for you to have those objects created in the local instance. I have not tried doing this for a complete instance.
At most, you might have to create the databases in the local instance first and then let Visual Studio see that the tables and other objects are not there.
You could also just take the last backup of each database and let the developer restore them to their local instance. However this can vary on each environment depending on security policy and what type of data is in the database.
I tend to just use PowerShell to build the scripts for me. I have more control over what is scripted out and when, so when I rerun the scripts on the local instance I can do it in the order it needs to be done in. It may take a little more time, but I get better-functioning scripts to work with, and PS is just my preference. There are some good scripts already written in the SQL community that can help you with this. Jen McCown did a blog post collecting all the posts her husband has written for doing just this, right here.
I've blogged about how to build a database from a set of .sql files using the SQL Compare command line.
http://geekswithblogs.net/SQLDev/archive/2012/04/16/how-to-build-a-database-from-source-control.aspx
The post is more from the point of view of setting up continuous integration, but the principles are the same.

How to automate upsizing from Access to SQL Server?

I need to automate the migration from an Access (2003) to an SQL Server DB (2005 or 2008). The upsizing should be done automatically as part of a build process.
I need that because there are two versions of the software: a single-user rich client and a web version. The Access DB is used for the single-user version to minimize setup effort, and SQL Server to improve performance and scaling with many simultaneous users. Access should be the "leading" DB, meaning devs make changes in the Access DB and those are propagated to the SQL Server within the build process. Many changes will occur, so doing it manually is not an option.
I am new to the Microsoft world, so I don't know the appropriate tools for that. What tools can I use, and how? I know how to do it (by clicking) with the Upsizing Wizard. Perhaps I can automate that somehow?
Thanks in advance for your answers.
Cheers,
Arne
Why not use a SQL Server back end for the Access database? Then changes are only made once and everything stays in sync.
The problem here is that you're trying to do versioning across two different systems.
If all of the design changes can be done in Access, then you can simply upsize that to SQL Server. The question is: do you need to modify the existing SQL Server databases, and do you need this over time? Upsizing the Access database will get the new version up to SQL Server, but it WILL NOT help for existing SQL Server databases.
If, over time, you need changes to tables on BOTH systems and on existing databases, then I would tell the Access developers that ALL changes to the tables MUST be written as DDL SQL. You then execute that DDL on Access to make the changes. The developer MUST THEN take the same DDL and run it on the SQL side; it might need slight modifications, which they fix until it runs on SQL Server. You then wind up with two scripts: one for SQL Server and one for Access. On the Access side, you can write a few lines of code to read and process a text file of DDL (the same code can be used for all scripts).
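To make the two-script idea concrete, here is a hedged example of the same change written twice; Access's DDL dialect (Jet SQL) requires the COLUMN keyword and has its own type names (the table and column here are hypothetical):

    -- sqlserver_changes.sql (run on the SQL Server side)
    ALTER TABLE Orders ADD ShippedDate datetime NULL;

    -- access_changes.txt (run by the Access DDL-reading code; Jet SQL dialect)
    -- ALTER TABLE Orders ADD COLUMN ShippedDate DATETIME;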
That way, you build up an equivalent set of changes for each upgrade. Simply NEVER allow design changes directly to the Access tables; if you allow design changes with the table designer, you are in big trouble, as you no longer know what changes need to be carried over to SQL Server.
However, as a better approach, I would suggest that table design changes are made FIRST in SQL Server. The reason is that the table designer in the SQL Server tools can script the changes just made to the clipboard (a simple right click), and the designer is very much like the Access one, so it is easy to use even for Access developers with little SQL Server experience. That right click of "script changes to clipboard" produces the correct change script for SQL Server, which they save into a file. They then take that DDL and run it on the Access side, where it might need slight modifications, but they did not have to write the DDL statement themselves, and you know the SQL Server DDL is correct. They run and fix the Access version until the table change is made, and save that adjusted script into a file for the Access update as well. As long as both scripts are fixed up as you go along, you will be fine.
So over a period of changes, you wind up with two scripts that keep the changes in sync for both systems.
If you don't need changes to existing SQL Server and Access databases, then just let the Access folks have at it, and then simply upsize the whole database to SQL Server in one shot and you're done.
I can't really think of any other approach between the above two extremes.
You could have the users make the change in Access, then go to SQL Server, make the same change in the SQL table designer, and save the SQL Server change script. However, this would not get you a change script for the Access side. If a change script is NOT needed for the Access side, then this would work very well, as you would still get a SQL Server change script.
The above is how I would go about this, and it's quite workable...

Tool to copy SQL Server 2008 db to SQL Server 2008 Express?

I have a typical dev scenario: I have a SQL 2008 database that I want to copy every so often to my local instance of 2008 Express so that I can do dev, make changes, etc. to the local copy. I have some constraints though: the source db is part of a live e-commerce site in shared hosting so I can't detach it and the hosting service wants me to pay $5 for each ad hoc back up I invoke.
What I'd like is some tool that I can invoke ad hoc to take a snapshot (complete, not incremental) of the live db that I can then import in my local one. I've tried the SSMS 2008 Copy Database Wizard but it gives me an error saying I can't do that with Express. I tried the Generate Scripts tool and thought that was going to make it - the export to my local disk worked but when I went to import using SQLCMD (the script was 1GB so SSMS errored when I tried to open it there), it told me there was a syntax error a few thousand lines in.
Coming from the MySQL world, this process is trivial. All I want is an analog of mysqldump and then a command-line way to import that file into a db. Surely there's an easy way to do this in the SQL Server world? This seems like the most basic use-case for developers.
[ Yes, I've seen a few other questions here that seem similar but I didn't think they had the same constraints. ]
Best answer: full backup, restore, pay $5. Anything else seems to me like it'd waste a lot more than $5 worth of time.
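For reference, the round trip is only two statements; the names and paths below are placeholders, and the MOVE clauses matter because the local Express instance's data directory will differ from the host's (check the logical file names first with RESTORE FILELISTONLY):

    -- On the host (the ad hoc backup you pay the $5 for):
    BACKUP DATABASE ShopDb TO DISK = N'D:\backups\ShopDb.bak' WITH INIT;

    -- Locally, after copying the .bak file down:
    RESTORE DATABASE ShopDb
    FROM DISK = N'C:\temp\ShopDb.bak'
    WITH MOVE 'ShopDb'     TO N'C:\Data\ShopDb.mdf',
         MOVE 'ShopDb_log' TO N'C:\Data\ShopDb_log.ldf',
         REPLACE;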
If they don't charge you to run queries against the database, these tools may help. Granted, they are not free tools, but they are handy on so many fronts that it would be worth buying one. These tools can diff your source and target databases, both data and structure (or just one or the other), and optionally sync the target database to be just like the source.
http://www.innovartis.co.uk/
http://www.red-gate.com/products/sql%5Fdata%5Fcompare/index.htm
Try SQL Dumper.
SQL Server Dumper enables you to dump selected SQL Server database tables into SQL INSERT statements that are saved as local .sql files and contain all the data required to create a duplicate table, or to be used for backup purposes. You can choose to create an individual .sql file for each table or combine all selected tables into a single file.
SQL Server Database Publishing Wizard and osql usually do the trick for me with large databases.

Restore SQL Server 2008 database to SQL Server 2000

I have to move an entire database from a SQL Server 2008 machine to a SQL Server 2000 machine.
I created a backup using Management Studio 2008, copied it to the hard drive of the 2000 box, and from within Management Studio 2008 I chose Restore Database to the 2000 box.
I get an error message stating, "The media family on device ... is incorrectly formed. SQL Server cannot restore this media family".
If I use Enterprise Manager 2000 I get the same error.
Is there a way to move a whole database from the newer SQL server to the older?
The only thing I can think of is to recreate the whole structure and then copy the data from the live database. So: create scripts that will create the tables, views, and SPs, and then create scripts to copy the data from the existing database.
As others have already said, there is no default way to do this; it's just not supported. Here are more extensive details on how to do it properly and avoid any migration issues.
You need to generate scripts for structure and data and then execute them on SQL 2000 (as others have already said), but there are a couple of things to take into account.
Generate scripts in SSMS
Make sure to check the option to script for SQL Server 2000, to avoid issues such as trying to create something like a geography-type column on SQL 2000.
Make sure to review the execution order of the scripts to avoid dependency-based errors.
This is a great option for small to medium-sized databases and requires some knowledge of SQL Server (dependencies, differences between versions, and such).
Third party tools
The idea is to use third-party database comparison tools such as ApexSQL Diff or Data Diff.
The good side is that these will take care of script execution and the differences between versions.
The not-so-good side is the fact that you'll need to pay for them after the trial ends.
I've used these two tools successfully, but you can't go wrong with any other tool on the market. Here is a list of other tools in this category.
You can't move backups from a newer version to an older one. In that case, you can script your database, execute the script on the 2000 box, and then use the standard data transfer tools to move any data you want.
Provided you have a network connection between the machines, use SSIS. Much easier and a lot less messing around.
You can use the script generator for your database and then, in the properties form, select General -> Script for server version: SQL Server 2000.
The script generator will show you anything that is not compatible with your server version.
I've heard you can only do it by generating the SQL statement dump from the DB administrator tool and re-running those queries on the target older database.
You can generate a script that will recreate all the objects and transfer all the data... as long as everything in the db is valid in SQL 2000. So no ROW_NUMBER(), no PARTITION, no CTEs, no datetime2, hierarchyid, or several other field types, no EXECUTE AS, and lots of other goodness. Basically, there's a pretty good chance it's not possible unless your db is pretty basic.
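As one concrete example of the kind of rewrite involved, a ROW_NUMBER() ranking has to become something like a correlated subquery on SQL 2000 (the table and columns here are hypothetical, and the rewrite assumes OrderDate is unique enough to rank by):

    -- SQL Server 2005+ version: will not parse on SQL 2000
    SELECT OrderID, ROW_NUMBER() OVER (ORDER BY OrderDate) AS rn
    FROM dbo.Orders;

    -- A SQL 2000-compatible equivalent
    SELECT OrderID,
           (SELECT COUNT(*) FROM dbo.Orders o2
            WHERE o2.OrderDate <= o.OrderDate) AS rn
    FROM dbo.Orders o;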
We had a similar situation. A very low-tech but handy solution (sketched after the list below) is:
backup and truncate the tables in SQL 2000.
create a LINKED server in SQL 2008, pointing to SQL 2000
run a SELECT query against sysobjects to generate a script of INSERT INTO LINKEDSERVER.db.dbo.table SELECT * FROM table statements
execute query script.
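A hedged sketch of steps 2-4 (server, database, and provider names are placeholders; identity columns would also need SET IDENTITY_INSERT and explicit column lists):

    -- Step 2: on the SQL 2008 box, create the linked server.
    EXEC sp_addlinkedserver
        @server     = N'SQL2000BOX',
        @srvproduct = N'',
        @provider   = N'SQLNCLI',
        @datasrc    = N'old-server-name';

    -- Step 3: generate one INSERT...SELECT per user table.
    SELECT 'INSERT INTO SQL2000BOX.TargetDb.dbo.[' + name + ']'
         + ' SELECT * FROM dbo.[' + name + ']'
    FROM sysobjects
    WHERE xtype = 'U';

    -- Step 4: paste the generated statements into a query window and execute.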
