How to take table-level backup (dump) in MS SQL Server 2005/2008?
You cannot use the BACKUP DATABASE command to back up a single table, unless, of course, the table in question is allocated to its own FILEGROUP.
What you can do, as you have suggested, is export the table data to a CSV file. Then, to get the definition of your table, you can 'script out' the CREATE TABLE statement.
You can do this within SQL Server Management Studio, by:
right-clicking the database > Tasks > Generate Scripts
You can then select the table you wish to script out and also choose to include any associated objects, such as constraints and indexes.
To get the data along with the schema, you've got to choose Advanced on the Set Scripting Options tab and, in the General section, set "Types of data to script" to "Schema and Data".
I am using the bulk copy utility to achieve table-level backups
to export:
bcp.exe "select * from [MyDatabase].dbo.Customer " queryout "Customer.bcp" -N -S localhost -T -E
to import:
bcp.exe [MyDatabase].dbo.Customer in "Customer.bcp" -N -S localhost -T -E -b 10000
As you can see, you can export based on any query, so you can even do incremental backups this way. Plus, it is scriptable, as opposed to the other methods mentioned here that use SSMS.
Here are the steps you need. Step 5 is important if you want the data; Step 2 is where you can select individual tables.
EDIT: Stack's version isn't quite readable... here's a full-size image: http://i.imgur.com/y6ZCL.jpg
You can run the query below to take a backup of an existing table; it creates a new table with the structure of the old table along with its data.
select * into newtablename from oldtablename
To copy just the table structure without any data, use the query below.
select * into newtablename from oldtablename where 1 = 2
This is similar to qntmfred's solution, but uses a direct table dump. This option is slightly faster (see the BCP docs):
to export:
bcp "[MyDatabase].dbo.Customer " out "Customer.bcp" -N -S localhost -T -E
to import:
bcp [MyDatabase].dbo.Customer in "Customer.bcp" -N -S localhost -T -E -b 10000
If you're looking for something like MySQL's DUMP, then good news: SQL Server 2008 Management Studio added that ability.
In SSMS, just right-click on the DB in question and select Tasks > Generate Scripts. Then in the 2nd page of the options wizard, make sure to select that you'd like the data scripted as well, and it will generate what amounts to a DUMP file for you.
Create a new filegroup, put the table on it, and back up that filegroup only. A sketch of the idea follows.
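For illustration, here is a minimal T-SQL sketch of that approach; the database, file path, table, and index names are all assumptions, and it presumes the table already has a clustered index named IX_Customer:

ALTER DATABASE MyDatabase ADD FILEGROUP CustomerFG;
ALTER DATABASE MyDatabase
    ADD FILE (NAME = 'CustomerFG_Data', FILENAME = 'C:\Sql\CustomerFG_Data.ndf')
    TO FILEGROUP CustomerFG;
-- Move the table by rebuilding its clustered index on the new filegroup.
CREATE CLUSTERED INDEX IX_Customer ON dbo.Customer (CustomerID)
    WITH (DROP_EXISTING = ON) ON CustomerFG;
-- Back up just that filegroup.
BACKUP DATABASE MyDatabase
    FILEGROUP = 'CustomerFG'
    TO DISK = 'C:\Backups\CustomerFG.bak';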
You can use the free Database Publishing Wizard from Microsoft to generate text files with SQL scripts (CREATE TABLE and INSERT INTO).
You can create such a file for a single table, and you can "restore" the complete table including the data by simply running the SQL script.
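For reference, such a generated script is roughly shaped like this (the table and data here are purely illustrative, not the wizard's exact output):

CREATE TABLE dbo.Customer (
    CustomerID int NOT NULL PRIMARY KEY,
    Name       nvarchar(100) NOT NULL
);
INSERT INTO dbo.Customer (CustomerID, Name) VALUES (1, N'Alice');
INSERT INTO dbo.Customer (CustomerID, Name) VALUES (2, N'Bob');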
I don't know whether this matches the problem described here, but I once had to take an incremental backup of a table (only newly inserted data should be copied). I designed a DTS package in which:
I fetch the new records (on the basis of a 'status' column) and transfer the data to the destination (through a Transform Data Task).
Then I just update the 'status' column (through an Execute SQL Task).
I had to wire up the workflow properly; a T-SQL sketch of the idea follows.
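As a minimal illustration, assuming a 'status' flag column and hypothetical table names (this is not the original package, just the idea in T-SQL):

-- Copy rows not yet transferred, then flag them as copied.
BEGIN TRANSACTION;
INSERT INTO Archive.dbo.Customer (CustomerID, Name, CreatedAt)
SELECT CustomerID, Name, CreatedAt
FROM   MyDatabase.dbo.Customer WITH (UPDLOCK, HOLDLOCK) -- close the gap between the two statements
WHERE  status = 0;      -- 0 = not yet copied
UPDATE MyDatabase.dbo.Customer
SET    status = 1       -- 1 = copied
WHERE  status = 0;
COMMIT TRANSACTION;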
Use the SQL Server Import and Export Wizard:
In SSMS, open the Database Engine.
Right-click the database containing the table to export.
Select "Tasks".
Select "Export Data...".
Follow the wizard.
Every recovery model lets you back up a whole or partial SQL Server database or individual files or filegroups of the database. Table-level backups cannot be created.
From: Backup Overview (SQL Server)
You probably have two options, as SQL Server doesn't support table-level backups. Both start with scripting the table creation. Then you can either use the Script Table - INSERT option, which generates a lot of INSERT statements, or use Integration Services (DTS in SQL Server 2000) or similar to export the data as CSV.
BMC Recovery Manager (formerly known as SQLBacktrack) allows point-in-time recovery of individual objects in a database (i.e., tables). It is not cheap but does a fantastic job:
http://www.bmc.com/products/proddocview/0,2832,19052_19429_70025639_147752,00.html
http://www.bmc.com/products/proddocview/0,2832,19052_19429_67883151_147636,00.html
If you are looking to be able to restore a table after someone has mistakenly deleted rows from it you could maybe have a look at database snapshots. You could restore the table quite easily (or a subset of the rows) from the snapshot. See http://msdn.microsoft.com/en-us/library/ms175158.aspx
A free app named SqlTableZip will get the job done.
Basically, you write any query (which, of course, can also be select * from table) and the app creates a compressed file with all the data, which can be restored later.
Link:
http://www.doccolabs.com/products_sqltablezip.html
Handy Backup automatically makes dump files from MS SQL Server, including MSSQL 2005/2008. These dumps are table-level binary files containing exact copies of the particular database's content.
To make a simple dump with Handy Backup, follow these instructions:
Install Handy Backup and create a new backup task.
At Step 2, select “MSSQL” as the data source. In the new window, mark the database to back up.
Select among the different destinations where you will store your backups.
At Step 4, select the “Full” backup option. Set up a time stamp if you need it.
Skip Step 5 unless you need to compress or encrypt the resulting dump file.
At Step 6, set up a schedule for the task to create dumps periodically (otherwise, run the task manually).
Again, skip Step 7, and give your task a name at Step 8. Your task is finished!
Now run your new task by clicking the icon before its name, or wait for the scheduled time. Handy Backup will automatically create a dump of your database.
Then open your backup destination. You will find a folder (or a couple of folders) with your MS SQL backups. Each such folder contains a table-level dump file, consisting of binary tables and settings compressed into a single ZIP.
Other Databases
Handy Backup can save dumps from MySQL, MariaDB, PostgreSQL, Oracle, IBM DB2, Lotus Notes and any generic SQL database that has an ODBC driver. Some of these databases require additional steps to establish a connection between the DBMS and Handy Backup.
The tools described above often dump SQL databases as table-level SQL command sequences, making these files ready for any manual modifications you need.
Related
I need to get a dev database for my local setup at work. I have run into one major problem: one of the tables (which accounts for 99% of the activity on the database) is about 7 GB. It's not necessary that I get all of the rows from this table; the top 10,000 or so would work.
Is there any way, in a script or anything, to copy a database and specify how many rows to take from a certain table? I know it's a long shot. I just don't want to spend half of my day restoring a database because of the size of one table.
If you want to be fancy (and who doesn't?), you could have that table on its own filegroup in production and then restore all but that filegroup in dev. Then you could create the table fresh upon restore, import some data (I'd write a simple SSIS package myself for that), and you should be good. A sketch of the partial restore is below.
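For illustration, a piecemeal restore that brings back only the PRIMARY filegroup might look like this (all database names, logical file names, and paths are assumptions):

RESTORE DATABASE DevDb
    FILEGROUP = 'PRIMARY'
    FROM DISK = 'C:\Backups\ProdDb.bak'
    WITH PARTIAL,
         MOVE 'ProdDb'     TO 'C:\Sql\DevDb.mdf',
         MOVE 'ProdDb_log' TO 'C:\Sql\DevDb_log.ldf',
         RECOVERY;
-- The big table's filegroup is left unrestored; recreate the table
-- (e.g. on PRIMARY) and load a sample of rows into it instead.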
You can actually try a few things to export (and then import) the data; you can use any of the following methods (a small T-SQL sketch follows the list):
SSIS package
SQL Server Import Export Wizard in SSMS (which can also create SSIS package)
Use the query results tab, and save the results as CSV
Use BCP with the queryout flag to export data
Use SQLCMD with the -o flag
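Since only a slice of the big table is needed, here is a hedged sketch using a TOP clause over a linked server (the linked server name, table, and columns are assumptions):

INSERT INTO DevDb.dbo.BigTable (Id, Payload, CreatedAt)
SELECT TOP (10000) Id, Payload, CreatedAt
FROM  ProdServer.ProdDb.dbo.BigTable
ORDER BY CreatedAt DESC;   -- newest rows first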
I'm using SQL Server 2008 R2 Express.
I would like to create an installation file.sql file for an example database.
By file.sql, I mean a file that will run from Microsoft SQL Server Management Studio as a query.
I would like to have a single file including the database itself and all tables and data.
How do I export an entire database into such a file?
P.S.
It is a very small database.
I do not worry about database name duplicate on the new server.
There are Unicode characters and many special characters in data including {[(<`.,'|"?*&/>)]}
In Management Studio:
Right-click on the database name.
Choose Tasks > Generate Scripts.
Next, choose "Script entire database".
Next, go to Advanced.
For "Types of data to script", choose "Schema and Data" (and check the other options as your needs dictate).
Select a file to save to.
Finish.
EDIT: If it's only ever going to be used from Management Studio, you'll be faster using backup and restore!
I would just add the caveat that if you have large amounts of data (more than 1000 records), you need to batch the INSERT statements into groups of at most 1000 rows at a time: a single INSERT ... VALUES table value constructor is limited to 1000 rows.
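For illustration (the table and rows here are hypothetical), a data script that uses multi-row VALUES has to break the inserts into batches like this:

INSERT INTO dbo.Customer (CustomerID, Name) VALUES
    (1, N'Alice'),
    (2, N'Bob');      -- ... up to 1000 rows per statement
GO
INSERT INTO dbo.Customer (CustomerID, Name) VALUES
    (1001, N'Carol'); -- next batch of up to 1000 rows
GO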
You may also need to pay attention to the order in which the inserts occur, to avoid constraint conflicts. A safe approach is to extract the FK constraints into a separate file that is executed last.
Our SQL Server 2000 instance hosts several similar databases, one for each of our clients. When it comes time to update them all, we use Red Gate SQL Compare to generate a migration script between the development database and a copy of the current state of the clients' databases.
SQL Compare generates a transactional script: if one step fails, the script rolls back everything. But currently our system uses a method that splits the script on batch separators (the GO statement) and then runs each command separately, which ruins all the transactional behavior. The GO statement is not supported when querying the database programmatically (in classic ASP).
I want to know how I could run that script (keeping the transactions) on all those databases (around 250), programmatically or manually in a tool. In Query Analyzer, we need to select each DB and press Run, which takes quite a while for the number of DBs we have.
If you can use SSMS from SQL 2005 or 2008, then I'd recommend the free SSMS Tools Pack.
I use the external sqlcmd command-line tool; I have the same situation on the server where I work.
I keep the script in a *.sql file and the list of databases in a second file. A small *.bat script iterates through all the databases and executes the script using the sqlcmd command.
In more detail, I have something like this:
DB.ini file with all the databases on which I want to deploy my script
sql/ directory where I store all scripts
runIt.bat - script which deploys scripts
The command line looks more or less like this (-E uses a trusted connection):
sqlcmd -S <ComputerName>\<InstanceName> -i <MyScript.sql> -d <database_name> -E
In SQL Server 2000, the equivalent was the osql utility.
UPDATE
Red Gate now has a tool called SQL Multi Script, which does basically exactly what you want. It supports SQL 2000 to 2008 R2 and can run queries on multiple databases in parallel, which improves performance.
Seven years later: I had hit the same issue so many times that I built and published a project for it:
TAKODEPLOY
Here are some features:
Get all databases from a single instance and apply a name filter, or just use a single direct connection.
Mix database sources as much as you want; for example, two direct connections plus one full instance, with or without a filter.
Script editor (AvalonEdit, the same component MonoDevelop uses).
Scripts are parsed and errors are detected before executing.
Scripts are split on GO statements.
Save your deployment into a file.
Get a list of all databases before deploying.
See in real time what is happening (PRINT statements are recommended here!).
Automatic per-database rollback if any error occurs.
Transparent updates via Squirrel.
You can get it at: https://github.com/andreujuanc/TakoDeploy
Not sure if this will work, but try replacing the GO statements with semicolons and running the entire script as one batch (see the sketch below). One caveat: some statements, such as CREATE PROCEDURE, must be the first statement in their batch, which is why the GO separators are there in the first place.
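If you do collapse the batches, here is a hedged sketch of keeping the whole migration transactional (the ALTER/UPDATE statements are placeholders, not SQL Compare output):

SET XACT_ABORT ON;   -- any run-time error aborts and rolls back the batch
BEGIN TRANSACTION;

ALTER TABLE dbo.Customer ADD Email nvarchar(256) NULL;

-- A statement that depends on the new column must be deferred (this is
-- what a GO separator normally achieves); EXEC compiles it at run time.
EXEC (N'UPDATE dbo.Customer SET Email = N''unknown@example.com'' WHERE Email IS NULL;');

COMMIT TRANSACTION;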
If I recall, you can also create a script in SQL Compare to change everything back to the state it started in. You might want to generate both.
When I did this sort of deployment (it's been a while), I first deployed to a staging server built exactly like prod, to make sure the scripts would work on prod. If anything failed, it was usually because of the order in which the scripts were run (you can't set a foreign key to a table that doesn't exist yet, for instance). I also scripted all table changes first, then all view changes, then all UDF changes, then all stored proc changes. This cut down greatly on the failures due to objects not yet existing, but I still usually had a few that needed to be adjusted.
Dumb question - what's the best way to copy instances in an environment where I want to refresh a development server with instances from a production server?
I've done backup-restore, but I've heard of detach-copy-attach, and one guy even told me he would just copy the data files between the file systems...
Are these the three (or two; the last one sounds kind of suspect) accepted methods?
My understanding is that the second method is faster but requires downtime on the source because of the detach step.
Also, in this situation (wanting an exact copy of production on a dev server), what's the accepted practice for transferring logins, etc.? Should I just back up and restore the user databases plus master and msdb?
The easiest way is actually a script.
Run this on production:
USE MASTER;
BACKUP DATABASE [MyDatabase]
TO DISK = 'C:\temp\MyDatabase1.bak' -- some writeable folder.
WITH COPY_ONLY
This one command makes a complete backup copy of the database onto a single file, without interfering with production availability or backup schedule, etc.
To restore, just run this on your dev or test SQL Server:
USE MASTER;
RESTORE DATABASE [MyDatabase]
FROM DISK = 'C:\temp\MyDatabase1.bak'
WITH
MOVE 'MyDatabase' TO 'C:\Sql\MyDatabase.mdf', -- or wherever these live on target
MOVE 'MyDatabase_log' TO 'C:\Sql\MyDatabase_log.ldf',
REPLACE, RECOVERY
Then save these scripts on each server. One-click convenience.
Edit:
if you get an error when restoring that the logical names don't match, you can get them like this:
RESTORE FILELISTONLY
FROM disk = 'C:\temp\MyDatabaseName1.bak'
If you use SQL Server logins (not windows authentication) you can run this after restoring each time (on the dev/test machine):
USE MyDatabaseName;
EXEC sp_change_users_login 'Auto_Fix', 'userloginname', NULL, 'userpassword';
The fastest way to copy a database is the detach-copy-attach method, but production users will not have database access while the prod DB is detached. You can do something like this if your production DB is, for example, a point-of-sale system that nobody uses during the night. A sketch is below.
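A minimal sketch of detach/copy/attach (the database name and file paths are illustrative):

USE master;
EXEC sp_detach_db @dbname = 'MyDatabase';
-- Copy MyDatabase.mdf and MyDatabase_log.ldf to the target server here,
-- then attach the copies:
CREATE DATABASE MyDatabase
    ON (FILENAME = 'C:\Sql\MyDatabase.mdf'),
       (FILENAME = 'C:\Sql\MyDatabase_log.ldf')
    FOR ATTACH;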
If you cannot detach the production DB, you should use backup and restore.
You will have to create the logins if they do not exist in the new instance. I do not recommend copying the system databases.
You can use SQL Server Management Studio to create scripts for the logins you need. Right-click on the login you need to create and select Script Login As > CREATE.
This will list the orphaned users:
EXEC sp_change_users_login 'Report'
If you already have a login id and password for this user, fix it by doing:
EXEC sp_change_users_login 'Auto_Fix', 'user'
If you want to create a new login id and password for this user, fix it by doing:
EXEC sp_change_users_login 'Auto_Fix', 'user', 'login', 'password'
UPDATE:
My advice below tells you how to script a DB using SQL Server Management Studio, but the default settings in SSMS miss out all sorts of crucial parts of a database (like indexes and triggers!) for some reason. So, I created my own program to properly script a database including just about every type of DB object you may have added. I recommend using this instead. It's called SQL Server Scripter and it can be found here:
https://bitbucket.org/jez9999/sqlserverscripter
I'm surprised no one has mentioned this, because it's really useful: you can dump out a database (its schema and data) to a script, using SQL Server Management Studio.
Right-click the database, choose "Tasks | Generate Scripts...", and then select to script specific database objects. Select the ones you want to copy over to the new DB (you probably want to select at least the Tables and Schemas). Then, for the "Set Scripting Options" screen, click "Advanced", scroll down to "Types of data to script" and select "Schema and data". Click OK, and finish generating the script. You'll see that this has now generated a long script for you that creates the database's tables and inserts the data into them! You can then create a new database, and change the USE [DbName] statement at the top of the script to reflect the name of the new database you want to copy the old one to. Run the script and the old database's schema and data will be copied to the new one!
This allows you to do the whole thing from within SQL Server Management studio, and there's no need to touch the file system.
Below is what I do to copy a database from the production environment to my local environment:
Create an empty database in your local sql server
Right click on the new database -> tasks -> import data
In the SQL Server Import and Export Wizard, select the production server's name as the data source, and select your new database as the destination.
It's hard to detach your production DB (or other running DBs) and deal with the downtime, so I almost always use the backup/restore method.
If you also want to make sure to keep your logins in sync, check out the MS KB article on using the stored proc sp_help_revlogin to do this.
The detach/copy/attach method will take down the database. That's not something you'd want in production.
The backup/restore will only work if you have write permissions to the production server. I work with Amazon RDS and I don't.
The import/export method doesn't really work because of foreign keys, unless you do the tables one by one in the order they reference one another. You can do an import/export into a new database; that will copy all the tables and data, but not the foreign keys.
This sounds like a common operation one needs to do with databases. Why doesn't SQL Server handle this properly? Every time I've had to do this, it was frustrating.
That being said, the only painless solution I've encountered is the SQL Azure Migration Tool, which is maintained by the community. It works with SQL Server too.
I run an SP to DROP the table(s) and then use a DTS package to import the most recent production table(s) onto my development box.
Then I go home and come back the following morning. It's not elegant, but it works for me.
If you want to take a copy of a live database, do the Backup/Restore method.
[In SQL Server 2000; not sure about 2008:] Just keep in mind that if you are using SQL Server accounts in this database (as opposed to Windows accounts), and the master DB is different or out of sync on the development server, the user accounts will not translate when you do the restore. I've heard about an SP to remap them, but I can't remember which one it was.
I've got a backup made using the regular SQL Server 2005 backup command. Is there a way to restore just a single table, not the whole DB?
Restoring the whole database to another machine (or a temporary database) and then copying the table over seems like the easiest approach to me; a sketch follows.
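A rough sketch, assuming hypothetical database, logical file, and table names:

RESTORE DATABASE MyDatabase_Temp
    FROM DISK = 'C:\Backups\MyDatabase.bak'
    WITH MOVE 'MyDatabase'     TO 'C:\Sql\MyDatabase_Temp.mdf',
         MOVE 'MyDatabase_log' TO 'C:\Sql\MyDatabase_Temp_log.ldf';

-- Copy the one table back into the live database
-- (assumes the live table exists and is empty; adjust as needed).
INSERT INTO MyDatabase.dbo.Customer
SELECT * FROM MyDatabase_Temp.dbo.Customer;

DROP DATABASE MyDatabase_Temp;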
This is not natively supported in SSMS but it’s possible using third party tools.
Apart from Red Gate (great tools, btw), you can try SQL Diff (to restore objects) and SQL Data Diff (to restore data) from ApexSQL.
Disclaimer: I'm not affiliated with ApexSQL, but we are their customers and use their tools.
The unit of backup and recovery in SQL Server is the database (it is the outer boundary of referential integrity).
Red Gate has some pretty good tools for row-level restore (SQL Data Compare and SQL Backup), but they come at a price.
Detach the current database, then restore a backup from the date containing the version of the table you need to a new location (make a sub-folder to keep it separate from your production databases). When the restore completes, find the table you need, script it to a CREATE script, and save it to a file. You're then done with the restored database: delete it and reattach the original one. Next, script the table you want to restore in the live database to a CREATE file as well (this is a backup only), then delete that table. Make sure your database is selected and active, load the script file you created from the restored database into Query Analyzer, and run it; it should report success. Now check that your table has been replaced. You're done.