Tool to copy SQL Server 2008 db to SQL Server 2008 Express? - sql-server

I have a typical dev scenario: I have a SQL 2008 database that I want to copy every so often to my local instance of 2008 Express so that I can do dev work, make changes, etc. on the local copy. I have some constraints, though: the source db is part of a live e-commerce site in shared hosting, so I can't detach it, and the hosting service wants me to pay $5 for each ad hoc backup I invoke.
What I'd like is some tool that I can invoke ad hoc to take a snapshot (complete, not incremental) of the live db that I can then import into my local one. I've tried the SSMS 2008 Copy Database Wizard, but it gives me an error saying I can't do that with Express. I tried the Generate Scripts tool and thought that was going to work: the export to my local disk succeeded, but when I went to import it using SQLCMD (the script was 1 GB, so SSMS errored when I tried to open it there), it reported a syntax error a few thousand lines in.
Coming from the MySQL world, this process is trivial. All I want is an analog of mysqldump and then a command-line way to import that file into a db. Surely there's an easy way to do this in the SQL Server world? This seems like the most basic use case for developers.
[ Yes, I've seen a few other questions here that seem similar but I didn't think they had the same constraints. ]

Best answer: full backup, restore, pay $5. Anything else seems to me like it'd waste a lot more than $5 worth of time.
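If you go that route, here is a minimal T-SQL sketch of the round trip. The database name, logical file names, and paths below are all assumptions; run RESTORE FILELISTONLY against the .bak first to get the real logical names.

    -- On the source server (the step the host charges for); COPY_ONLY
    -- avoids disturbing the host's own backup chain:
    BACKUP DATABASE [ShopDb]
        TO DISK = N'D:\Backups\ShopDb.bak'
        WITH COPY_ONLY, INIT;

    -- On the local Express instance, relocate the files with MOVE:
    RESTORE DATABASE [ShopDb]
        FROM DISK = N'C:\Backups\ShopDb.bak'
        WITH MOVE N'ShopDb'     TO N'C:\Data\ShopDb.mdf',
             MOVE N'ShopDb_log' TO N'C:\Data\ShopDb_log.ldf',
             REPLACE;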

If they don't charge you to run queries against the database, these tools may help. Granted, they are not free, but they are handy on so many fronts that it would be worth buying one. These tools can diff your source db and target db, both data and structure or just one or the other, and optionally sync the target database to match the source:
http://www.innovartis.co.uk/
http://www.red-gate.com/products/sql%5Fdata%5Fcompare/index.htm

Try SQL Dumper.
SQL Server Dumper enables you to dump selected SQL Server database tables into SQL INSERT statements, saved as local .sql files, that contain all the data required to create a duplicate table or to serve as a backup. You can choose to create an individual .sql file for each table or combine all selected tables into a single file.
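For orientation, the .sql files such dump tools emit are roughly this shape (the table, columns, and values here are made up):

    CREATE TABLE [dbo].[Customers] (
        [CustomerId] INT NOT NULL,
        [Name]       NVARCHAR(100) NOT NULL
    );
    INSERT INTO [dbo].[Customers] ([CustomerId], [Name]) VALUES (1, N'Alice');
    INSERT INTO [dbo].[Customers] ([CustomerId], [Name]) VALUES (2, N'Bob');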

SQL Server Database Publishing Wizard and osql usually do the trick for me with large databases.

Related

Duplicate localDB under SQL Servers on my laptop

I've been running into an issue recently when I attempt any tutorials that involve using a SQL database, Entity Framework, Dapper, etc.
When it comes time to publish a database, or utilize an ORM, I'm given duplicate options for the same LocalDB under SQL Servers. Furthermore, when I attempt to publish, the database doesn't show up under the LocalDB that I've chosen.
I'm wondering how I go about removing the other SQL Servers and just having the one available.
The Browse option gives me two of the same LocalDBs, plus a third one under \ProjectModels. I'm wondering what's causing this and how it can be fixed, since no matter which one I choose, the SQL database I attempt to publish doesn't show up within any of them.
My advice is not to use this method to publish the database (you can right-click the extra entries to delete them).
Please refer to this official documentation.
File-based databases like SQLite or SQL Server Express are designed to store their data in easily transferable files that can be served with your application/site.
"Copy to Output Directory" Property of the database file to "Copy if newer". Just point the address to it.
If you are using a server-based database like SQL Server, MySQL, etc., you need to make sure that the target machine/environment has the same database server installed, and you need to write a deployment script to attach the pre-populated data files to the server. This might be troublesome for you.

Trouble with Copy Database Wizard between two SQL 2008R2 Servers

I am trying to use Copy Database Wizard to copy from my live server (shared hosting) to my local machine. Both the live and local servers are SQL 2008 R2.
I have used CDW for several years with perfect success when copying from a live SQL 2000 server to my local 2008 R2. But now that I have migrated my live database to SQL 2008 R2 the CDW is giving me this error:
Could not read metadata, possibly due to insufficient access rights.
I've learned that this error can be predicted before you even complete the CDW setup: on the page where the CDW asks you for your desired destination database name, it is SUPPOSED to populate the .mdf and .ldf files with their names-to-be and sizes (in MB or GB).
But in my case these file names and sizes are not being shown (the area is simply blank in the wizard), and then of course when I attempt to execute the package it gives me the error.
After much research, I believe the reason for this error is the CDW requirement that "you must be a member of the sysadmin fixed server role on both the source and destination servers."
On my local server, my Windows Authentication login is listed as a Role Member for the sysadmin Server Role. However on my live server (keep in mind it is a shared SQL server with 250+ databases) the only Role Member listed is [sa].
Am I right in thinking that the only way to satisfy this requirement would be to add my specific SQL user to the live/source Server > Security > Server Roles > sysadmin role? I'm guessing that would never be done on a shared server right? Or is there some other way to make it work by messing with the specific database properties/users/roles?
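For reference, you can check that requirement on each server with the built-in IS_SRVROLEMEMBER function; a minimal query:

    -- Returns 1 if the current login is in the sysadmin fixed server role,
    -- 0 if not; run it on both the source and destination servers.
    SELECT IS_SRVROLEMEMBER('sysadmin') AS is_sysadmin;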
I can't explain why CDW is working from the live SQL 2000 server and not the 2008 R2. I HOPE it is simply that something isn't set up right on the live database, but maybe it is due to changes that were made to SQL security over the years.
In case it matters, I must use the SMO method instead of detach/attach because it is a live database that I don't want to take down. Historically the CDW from SQL 2000 only takes 3 minutes with SMO method so speed isn't an issue anyway.
Here's my preference for a solution:
Find a way to get CDW to work, most likely by changing something on the live server. Is this possible? What would it be?
If that fails, then...
What about the idea of using CDW to create the package, but then going into BIDS and manipulating something in the package to circumvent the sysadmin role requirement? (Does it really need the metadata? I don't need anything besides the actual data tables.) Is this possible?
UPDATE 6/14/2016: Editing a CDW package in BIDS won't work, as it appears to simply use the .mdf and .ldf files, which of course I don't have access to on the shared server. I think an alternative is to use the Import/Export Wizard to create a package, then edit it in BIDS. The annoying part is that without access to metadata the Import/Export Wizard doesn't seem to be aware of foreign keys, and thus doesn't know what order to process the tables in (a constraint-toggling workaround is sketched after this question).
If that fails, then...
Is there any other way to easily automate a daily copy from my live server to local machine? The reason I like CDW is because it is super simple to use (when it works), it can be scheduled to run daily as a SQL agent job, and requires no manual work on my part. Is there a "next best thing" if CDW can't be made to work?
You'd think that a very common scenario for all websites out there would be "how do I get a copy of my live database onto my local SQL server, daily, automatically"? But maybe I'm the weird one!
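On the foreign-key ordering problem mentioned in the update above, one hypothetical workaround is to disable constraint checking around the load; a sketch using the undocumented (but widely used) sp_MSforeachtable procedure:

    -- Before running the Import/Export package, on the destination db:
    EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL';

    -- ...load the data in any table order...

    -- Afterwards, re-enable and re-validate every constraint:
    EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL';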
Another simple solution would be the Import/Export Wizard.
In SSMS, right-click the database you want transferred and select 'Tasks' and then 'Export Data...'. It will open a wizard that is very similar to the CDW. The difference here is that I could not find a sysadmin requirement to use it.
At the end it will give you the option to run immediately and/or save the SSIS package. If you save the SSIS package (I prefer to save it to disk), you can then create a schedule via a SQL Agent job.
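As a sketch of that last step, the saved .dtsx file can be scheduled with the msdb job procedures (the job name, schedule, and package path below are made-up examples):

    USE msdb;
    GO
    EXEC dbo.sp_add_job      @job_name = N'Nightly live-to-local copy';
    EXEC dbo.sp_add_jobstep  @job_name  = N'Nightly live-to-local copy',
                             @step_name = N'Run export package',
                             @subsystem = N'SSIS',
                             @command   = N'/FILE "C:\Packages\LiveToLocal.dtsx"';
    EXEC dbo.sp_add_jobschedule @job_name = N'Nightly live-to-local copy',
                             @name          = N'Daily at 3am',
                             @freq_type     = 4,     -- daily
                             @freq_interval = 1,     -- every 1 day
                             @active_start_time = 30000;  -- 03:00:00
    EXEC dbo.sp_add_jobserver @job_name = N'Nightly live-to-local copy';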

Transferring a SQL Database on AWS

We have a SQL Server 2005 database on our local server.
I have to transfer it to our SQL Server 2012 instance on Amazon RDS (and of course I have to repeat the procedure for the other databases).
Here is what I did: I right-clicked the database, selected Generate Scripts - All tables - Copy Schema and Data, and saved everything as a .sql file.
At that point I attempted to use the SQL Azure Migration Wizard v5.15 (in a question here I saw that it works with AWS too; way to go, Microsoft!) to transfer the database to AWS.
However, it crashes.
No problem, I try to use SQL Server Management Studio to import the file, but as soon as the RAM consumed by the program reaches 1 GB (the DB is 3.4 GB) - BOOM - out of memory error!
What should I do now?
You'll need to do the creation part by part. I faced that problem some time ago; my scripts reached about 4 GB with only the schemas, tables, etc. So I think you should first of all generate your scripts for creating schemas, users, and logins. After that, tables, views, and procedures. Then the other objects, like jobs and functions. To conclude, export all the data you have to RDS through the Import/Export Wizard in SSMS.
I followed those steps and it worked for me.
Good luck!

Run the same replication script on several databases

We have several SQL Server 2000 databases (I know, we need to upgrade) that have the same structure, and we have them set up to replicate to another server. The problem is that whenever I have to change the structure (which is usually pretty easy to do on all databases, especially with tools from Red Gate), I have to stop the replication, make the structure changes, and then set up replication again. The steps to set up replication only take a few minutes for each database, but it's repetitive and drives me crazy. I have the IDE create a script of the replication setup and then just replace the name of the prior database with the name of the next database and run the script. Still annoying, but faster than clicking through the IDE and forgetting an option.
I've tried things like "sp_MSforeachdb", but that didn't look very promising.
My guess is I should just take the T-SQL that gets generated by the IDE, use it as a starting point to build a new T-SQL script, and have it take the name of the database as a parameter. Then, when something changes with the structure of the database, I'd address that in the T-SQL replication script and make the changes there. Is this an issue for anyone else? Does 2005 or 2008 have a better sp_MSforeachdb, so I wouldn't have to maintain some crazy script and could just have the IDE generate a script when there are changes, which I could then use on multiple databases easily?
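For what it's worth, sp_MSforeachdb can be pressed into service here; a rough sketch (the database names are placeholders, and the procedure is undocumented and unsupported):

    -- sp_MSforeachdb substitutes ? with each database name in turn.
    EXEC sp_MSforeachdb '
    IF ''?'' IN (''ShopDb1'', ''ShopDb2'', ''ShopDb3'')
    BEGIN
        USE [?];
        PRINT ''Setting up replication in '' + DB_NAME();
        -- paste the IDE-generated replication script here, with the
        -- hard-coded database name replaced by ? or DB_NAME()
    END';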
Are you using SQL Server replication? If so, why aren't you making the changes to the publishing database and letting it push out the schema changes to its subscribers? We do this occasionally on SQL 2005 and it works well for the most part; I don't have any experience with replication on 2000 servers.
If you can use SQL Server Management studio, then the SSMS tools pack has a widget to allow the same script to be run on different databases.
The SQLCMD tool can connect to SQL Server 2000 and enables interaction from the command line. Using parameterized scripts and a fixed set of .bat files (one for each server) could be a good alternative to what you do now.
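A sketch of that idea: save a script that uses a sqlcmd scripting variable for the database name, then each .bat file just invokes it with a different -v value (the server, file, and variable names here are made up):

    -- setup_replication.sql - run as, e.g.:
    --   sqlcmd -S MYSERVER -E -i setup_replication.sql -v DbName="ShopDb1"
    USE [$(DbName)];
    GO
    PRINT 'Setting up replication for ' + DB_NAME();
    -- IDE-generated replication statements go here, with the database
    -- name replaced by the $(DbName) scripting variable.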

mysqldump equivalent for SQL Server

Is there an equivalent schema & data export/dumping tool for SQL Server, as there is for MySQL with mysqldump? I'm trying to relocate a legacy ASP site, and I am way out of my happy place working on a Windows server.
Note: the DTS export utility on its own seems to export data without table defs.
Using Enterprise Manager and exporting the db gets closer, exporting the schema & data... but it still misses stored procedures.
Basically, I'm looking for a one-does-it-all solution that grabs everything I need at once.
To do this really easily with SQL Server 2008 Management Studio:
1.) Right click on the database (not the table) and select Tasks -> Generate Scripts
2.) Click Next on the first page
3.) If you want to copy the whole database, just click next. If you want to copy specific tables, click on "Select Specific Database Objects", select the tables you want, and then click next.
4.) "Save to File" should be selected. IMPORTANT: Click the Advanced button next to "Save to File", find "Types of data to script", and change "Schema only" to "Schema and data" (if you want to create the table) or "Data only" (if you're copying data to an existing table).
5.) Click through the rest and you're done! It will save as a .sql file.
The easiest way is the SQL Server Database Publishing Wizard.
Open source
Free
Does exactly what you want
Developed by Microsoft
It does not have all the features of mysqldump but it is close enough.
http://www.codeplex.com/sqlhost/wiki/view.aspx?title=database%20publishing%20wizard
The easiest way to move a database would be to use SQL Server Management Studio to export the database to another server, or if that doesn't work, make a backup like others have suggested and restore it elsewhere.
If you are looking for a way to dump the table structure to SQL as well as create insert scripts for the data a good free option would be to use amScript and amInsert from http://www.asql.biz/en/Download2005.aspx.
If you want a good paid option, I would check out Red Gate SQL Compare and Red Gate SQL Data Compare. These tools are probably overkill, though, and a bit pricey if you don't intend to use them a lot; I would think they'd mostly be relegated to DBAs. You can look at the Red Gate tools at http://www.red-gate.com/.
Not finding the right tool, I decided to create my own: a sqlserverdump command line utility. Check it out on github.
Even easier is to use the SMO API. It lets you do exactly what mysqldump does, and even better. Here is a code example:
http://samyem.blogspot.com/2010/01/automate-sql-dumps-for-sqlserver.html
Easiest would be a backup and restore, or detach and attach.
Or script out all the tables and BCP out the data, then BCP the data in on the new server.
Or use DTS/SSIS to do this.
SQL Enterprise Manager and SQL Server Management Studio have wizard-based approaches, and the latter will generate the scripts so you can see how it's done.
You could also use the BACKUP and RESTORE commands. More detail here: http://msdn.microsoft.com/en-us/library/ms189826.aspx
If you can get DTS or Integration Services to connect to both servers, you can use the wizards to 'copy objects' from one server to another. 'Copy Database' requires that the two servers can authenticate with each other, which typically means being on the same domain and that the service runs under a domain logon.
Otherwise, you can generate a script for the schema, and you can use an Integration Services/DTS package to export data to a file, then import it on the other.
We now generally use SQL Compare and SQL Data Compare. Red Gate's SQL Packager might also be an option.
Well, mysqldump output is just a series of SQL statements. You can do this with DTS, but why not just create a backup and restore it on your new machine?
If you want to do it via SQL:
http://msdn.microsoft.com/en-us/library/aa225964(SQL.80).aspx
Or just right click the DB and hit Tasks -> Backup (http://msdn.microsoft.com/en-us/library/ms187510.aspx)
Two things a backup/restore won't do:
1.) Get you off of a Microsoft server, which was part of the original question.
2.) Help quickly find a structural difference between two DBs that are supposed to have the same structure when one of them is running slowly. Unix diff or sdiff (ignoring white space) could do the comparison, but you need a way to produce the input files.
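For the second point, one hypothetical way to produce those input files is to dump each server's schema to text with a deterministic query and then diff the two files:

    -- Run on each server (e.g. via sqlcmd with -o serverA.txt) and
    -- diff/sdiff the resulting text files.
    SELECT  TABLE_NAME, COLUMN_NAME, DATA_TYPE,
            COALESCE(CHARACTER_MAXIMUM_LENGTH, -1) AS MAX_LEN,
            IS_NULLABLE
    FROM    INFORMATION_SCHEMA.COLUMNS
    ORDER BY TABLE_NAME, ORDINAL_POSITION;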
If you need equivalent SQL statements like CREATE TABLE... & INSERT INTO..., then I recommend you try HeidiSQL. It's a fantastic and free utility that can access Microsoft SQL Server, MySQL and PostgreSQL. It enables you to browse and edit data, create and edit tables, views, procedures, triggers and scheduled events. Also, you can export structure and data to SQL file.
http://www.heidisql.com
Go to Tools / Export database as SQL and select the schema.
Check the box to create the tables and “Insert” data. That’s it.
I prefer HeidiSQL to "Microsoft SQL Server Management Studio" or phpMyAdmin... etc.
