We want to rebuild our local test DB automatically, on the fly / on demand, from the structure of the production DB.
What would be nice would be to query the prod DB, retrieve the SQL that generates its tables/views, and then run that SQL against a cleaned-out local DB.
Or perhaps there is a better way to replicate a production DB on a local machine?
What is the best way to do this?
If I understood your question correctly, you need to copy the DB schema. You can use pg_dump with the --schema-only option to do this. It will dump only the schema of the database, and you can import it locally.
$ pg_dump mydb --schema-only > mydb-schema.sql
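To load that schema into a clean local database, feed the dump file to psql (mydb_local below is just a placeholder for your local DB name):
$ createdb mydb_local
$ psql mydb_local < mydb-schema.sql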
I have a source database and a destination database with common tables. Is there any online tool that will sync the schema of the tables in the source to the destination? Both databases are MS SQL Server.
Here are a couple of ideas:
Use Replication: http://www.howtoforge.com/mysql_database_replication
Use mysqldump in an import/export script to semi-automate it (a minimal example below)
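A sketch of such a script, assuming the prod host is reachable from the machine running it; the host, user, and database names here are placeholders:
$ mysqldump -h prod-host -u backup_user -p proddb > proddb.sql   # add --no-data to copy only the schema
$ mysql -h localhost -u root -p testdb < proddb.sql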
I have a number of tables in a database in the server side in PostgreSQL. I want to have them all in another database. Is it possible?
Use the PostgreSQL pg_dump utility with the -t table option to define the tables that should be dumped, and restore them in another database. For more information, see the PostgreSQL pg_dump documentation page.
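For example, a single table can be dumped and loaded straight into the other database in one step; mytable, sourcedb, and targetdb are placeholder names:
$ pg_dump -t mytable sourcedb | psql targetdb
The -t option can be repeated to dump several tables, or the output can be redirected to a .sql file and restored later with psql.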
How to take table-level backup (dump) in MS SQL Server 2005/2008?
You cannot use the BACKUP DATABASE command to back up a single table, unless, of course, the table in question is allocated to its own FILEGROUP.
What you can do, as you have suggested, is export the table data to a CSV file. In order to get the definition of your table, you can 'script out' the CREATE TABLE statement.
You can do this within SQL Server Management Studio, by:
right-clicking the database > Tasks > Generate Scripts
You can then select the table you wish to script out and also choose to include any associated objects, such as constraints and indexes.
In order to get the data along with the schema, you have to choose Advanced on the Set Scripting Options tab and, in the General section, set 'Types of data to script' to 'Schema and data'.
I am using the bulk copy utility (bcp) to achieve table-level backups.
to export:
bcp.exe "select * from [MyDatabase].dbo.Customer " queryout "Customer.bcp" -N -S localhost -T -E
to import:
bcp.exe [MyDatabase].dbo.Customer in "Customer.bcp" -N -S localhost -T -E -b 10000
As you can see, you can export based on any query, so you can even do incremental backups with this. Plus, it is scriptable, as opposed to the other methods mentioned here that use SSMS.
Here are the steps you need. Step 5 is important if you want the data; Step 2 is where you can select individual tables. A full-size image of the screenshots is here: http://i.imgur.com/y6ZCL.jpg
You can run the query below to take a backup of an existing table; it creates a new table with the structure of the old table along with its data.
select * into newtablename from oldtablename
To copy just the table structure, use the below query.
select * into newtablename from oldtablename where 1 = 2
This is similar to qntmfred's solution, but using a direct table dump. This option is slightly faster (see BCP docs):
to export:
bcp "[MyDatabase].dbo.Customer " out "Customer.bcp" -N -S localhost -T -E
to import:
bcp [MyDatabase].dbo.Customer in "Customer.bcp" -N -S localhost -T -E -b 10000
If you're looking for something like MySQL's DUMP, then good news: SQL Server 2008 Management Studio added that ability.
In SSMS, just right-click on the DB in question and select Tasks > Generate Scripts. Then in the 2nd page of the options wizard, make sure to select that you'd like the data scripted as well, and it will generate what amounts to a DUMP file for you.
Create a new filegroup, put the table on it, and back up only that filegroup, for example:
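A rough T-SQL sketch of that approach; MyDatabase, CustomerFG, dbo.Customer, and the file paths are all example names:
-- Add a filegroup and a data file for it.
ALTER DATABASE MyDatabase ADD FILEGROUP CustomerFG;
ALTER DATABASE MyDatabase
    ADD FILE (NAME = CustomerData, FILENAME = 'C:\Data\CustomerData.ndf')
    TO FILEGROUP CustomerFG;
-- Place the table on that filegroup (here at creation time; an existing table
-- can be moved by rebuilding its clustered index on CustomerFG).
CREATE TABLE dbo.Customer (CustomerID int PRIMARY KEY, Name nvarchar(100)) ON CustomerFG;
-- Back up only that filegroup.
BACKUP DATABASE MyDatabase
    FILEGROUP = 'CustomerFG'
    TO DISK = 'C:\Backups\MyDatabase_CustomerFG.bak';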
You can use the free Database Publishing Wizard from Microsoft to generate text files with SQL scripts (CREATE TABLE and INSERT INTO).
You can create such a file for a single table, and you can "restore" the complete table including the data by simply running the SQL script.
I don't know whether it matches the problem described here, but I had to take an incremental backup of a table (only newly inserted data should be copied). I designed a DTS package in which:
I fetch the new records (on the basis of a 'status' column) and transfer the data to the destination (through a 'Transform Data Task').
Then I just update the 'status' column (through an 'Execute SQL Task').
I had to set up the 'workflow' properly; a rough T-SQL equivalent of the same logic is sketched below.
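This is only a sketch of that status-column approach in plain T-SQL, assuming the destination is another database (or a linked server) reachable from the source; SourceDB, DestDB, dbo.Orders, and the status values are made-up names:
-- Copy only the rows that have not been transferred yet.
INSERT INTO DestDB.dbo.Orders (OrderID, OrderDate, Amount)
SELECT OrderID, OrderDate, Amount
FROM SourceDB.dbo.Orders
WHERE status = 'NEW';
-- Mark those rows so the next run skips them.
UPDATE SourceDB.dbo.Orders
SET status = 'TRANSFERRED'
WHERE status = 'NEW';
-- Run both statements in one transaction (with a suitable isolation level) so rows
-- inserted in between are not marked as transferred without having been copied.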
Use SQL Server Import and Export Wizard.
Open SSMS and connect to the Database Engine
Right-click the database containing the table to export
Select "Tasks"
Select "Export Data..."
Follow the wizard
"Every recovery model lets you back up a whole or partial SQL Server database or individual files or filegroups of the database. Table-level backups cannot be created."
From: Backup Overview (SQL Server)
You probably have two options, as SQL Server doesn't support table backups. Both would start with scripting the table creation. Then you can either use the Script Table - INSERT option, which will generate a lot of INSERT statements, or use Integration Services (DTS with SQL Server 2000) or similar to export the data as CSV.
BMC Recovery Manager (formerly known as SQLBacktrack) allows point-in-time recovery of individual objects in a database (aka tables). It is not cheap but does a fantastic job:
http://www.bmc.com/products/proddocview/0,2832,19052_19429_70025639_147752,00.html
http://www.bmc.com/products/proddocview/0,2832,19052_19429_67883151_147636,00.html
If you are looking to be able to restore a table after someone has mistakenly deleted rows from it, you could have a look at database snapshots. You can restore the table (or a subset of its rows) quite easily from the snapshot. See http://msdn.microsoft.com/en-us/library/ms175158.aspx
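A sketch of that approach, assuming a database named MyDatabase with a single data file whose logical name is also MyDatabase; all names and paths here are examples:
-- Create the snapshot (typically before risky work, or on a schedule).
CREATE DATABASE MyDatabase_Snapshot
ON (NAME = MyDatabase, FILENAME = 'C:\Snapshots\MyDatabase.ss')
AS SNAPSHOT OF MyDatabase;
-- Later, pull the accidentally deleted rows back out of the snapshot.
-- (Use SET IDENTITY_INSERT ... ON first if CustomerID is an identity column.)
INSERT INTO MyDatabase.dbo.Customer
SELECT s.*
FROM MyDatabase_Snapshot.dbo.Customer AS s
WHERE s.CustomerID NOT IN (SELECT CustomerID FROM MyDatabase.dbo.Customer);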
A free app named SqlTableZip will get the job done.
Basically, you write any query (which, of course, can also be [select * from table]) and the app creates a compressed file with all the data, which can be restored later.
Link:
http://www.doccolabs.com/products_sqltablezip.html
Handy Backup automatically makes dump files from MS SQL Server, including MSSQL 2005/2008. These dumps are table-level binary files containing exact copies of the particular database content.
To make a simple dump with Handy Backup, follow these instructions:
Install Handy Backup and create a new backup task.
On Step 2, select “MSSQL” as the data source. In the new window, mark the database to back up.
Select among the different destinations where you will store your backups.
On Step 4, select the “Full” backup option. Set up a time stamp if you need it.
Skip Step 5 unless you need to compress or encrypt the resulting dump file.
On Step 6, set up a schedule for the task to create dumps periodically (otherwise, run the task manually).
Again, skip Step 7, and give your task a name on Step 8. The task is finished!
Now run your new task by clicking the icon before its name, or wait for the scheduled time. Handy Backup will automatically create a dump of your database.
Then open your backup destination. You will find a folder (or a couple of folders) with your MS SQL backups. Each such folder contains a table-level dump file, consisting of binary tables and settings compressed into a single ZIP.
Other Databases
Handy Backup can save dumps for MySQL, MariaDB, PostgreSQL, Oracle, IBM DB2, Lotus Notes and any generic SQL database that has an ODBC driver. Some of these databases require additional steps to establish connections between the DBMS and Handy Backup.
The tools described above often dump SQL databases as a table-level SQL command sequence, making these files ready for any manual modifications you need.
I need to copy about 40 databases from one server to another. The new databases should have new names, but all the same tables, data and indexes as the original databases. So far I've been:
1) creating each destination database
2) using the "Tasks->Export Data" command to create and populate tables for each database individually
3) rebuilding all of the indexes for each database with a SQL script
Only three steps per database, but I'll bet there's an easier way. Do any MS SQL Server experts out there have any advice?
Given that you're performing this on multiple databases, you want a simple scripted solution, not a point-and-click solution.
This is a backup script that I keep around. Get it working for one database and then modify it for many.
-- On the source server:
BACKUP DATABASE Northwind
TO DISK = 'c:\Northwind.bak'

-- On the target server, list the logical file names in the backup and decide
-- where the mdf and ldf files should go:
RESTORE FILELISTONLY
FROM DISK = 'c:\Northwind.bak'

RESTORE DATABASE TestDB
FROM DISK = 'c:\Northwind.bak'
WITH MOVE 'Northwind' TO 'c:\test\testdb.mdf',
MOVE 'Northwind_log' TO 'c:\test\testdb.ldf'
GO
Maybe the easiest is to detach/reattach. In the server manager, right-click the DB, Tasks --> Detach. Then copy the MDF/LDF files to the new server and reattach by clicking on the server icon, Tasks --> Attach. It will ask you for the MDF file; make sure the name etc. is accurate.
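The same detach/attach can be scripted in T-SQL; Northwind and the file paths below are just examples:
-- On the source server:
EXEC sp_detach_db @dbname = N'Northwind';
-- Copy Northwind.mdf and Northwind_log.ldf to the new server, then attach them:
CREATE DATABASE Northwind
ON (FILENAME = 'C:\Data\Northwind.mdf'),
   (FILENAME = 'C:\Data\Northwind_log.ldf')
FOR ATTACH;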
In order of ease
Stop server / file copy / attach is probably easiest.
backup/restore - can be done disconnected pretty simple and easy
transfer DTS task - needs file copy permissions
replication - furthest from simple to setup
Things to think about: permissions, users, and groups on the destination server, especially if you're transferring or restoring.
There are better answers already but this is an 'also ran' because it is just another option.
For the low, low price of free, you could look at the Microsoft SQL Server Database Publishing Wizard. This tool allows you to script the schema, the data, or data and schema. Plus, it can be run from a UI or the command line (think CI process).
Backup -> Restore is the simplest, if you don't want to use replication.
If you use the Backup/Restore solution you're likely to have orphaned users, so be sure to check out this Microsoft article on how to fix them.
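For reference, a common quick fix for orphaned users after a restore looks like this; RestoredDb and appuser are placeholder names:
USE RestoredDb;
-- List database users whose SIDs no longer match a server login.
EXEC sp_change_users_login 'Report';
-- Remap an orphaned database user to the login with the same name.
EXEC sp_change_users_login 'Auto_Fix', 'appuser';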
Another one to check out that is quick and simple:
Simple SQL BULK Copy
http://projects.c3o.com/files/3/plugins/entry11.aspx
Back up the databases using the standard SQL backup tool in Enterprise Manager; then, when you restore on the second server, you can specify the name of the new database.
This is the best way to maintain the schema in its entirety.
Use backups to restore the databases to the new server with the new names.
Redgate SQL Compare and SQL Data Compare. The Comparison Bundle was by far the best investment a company I worked for ever made. Moving e-training content was a breeze with it.
Check these links:
for multiple DBs backup
and single DB restore