Query a postgres database for the SQL which generates the tables

We want to rebuild our local test DB automatically, on the fly / on demand, given the structure of the production DB.
What would be nice would be to query the prod DB, retrieve the SQL that generates the tables/views, and then run that SQL against a cleaned-out local DB.
Or perhaps there is a better way to replicate a production DB on a local machine?
What is the best way to do this?

If I understood your question correctly, you need to copy the DB schema. You can use pg_dump with the --schema-only parameter to do this. It will dump only the schema of the database, which you can then import locally.
$ pg_dump mydb --schema-only > mydb-schema.sql
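To load that dump into a fresh local database, something like this should work (a minimal sketch; mydb_test is a placeholder name for your local DB, and dropping it destroys whatever is there):
$ dropdb --if-exists mydb_test        # remove any previous copy
$ createdb mydb_test                  # create a clean, empty database
$ psql mydb_test < mydb-schema.sql    # replay the schema dump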

Related

Perform SQLDump on SQL Server

Is there a way to perform a SQL dump in SQL Server? On MySQL we can easily do that with the mysqldump command, but in SQL Server, when I try to export the data, it seems to only allow exporting per table and not per database. When I try the Copy Database task, it tries to copy the source DB to a destination DB. I came from MySQL and thought it would somehow behave the same way.
Any idea on how to achieve this?
Thanks to those who will reply
In SQL Server you would either use Backup and Restore, or extract all the schema and data into a .bacpac file.
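For the .bacpac route, a sketch using the SqlPackage command-line tool (the server, database, and file names are placeholders):
SqlPackage /Action:Export /SourceServerName:"ProdServer" /SourceDatabaseName:"MyDatabase" /TargetFile:"C:\temp\MyDatabase.bacpac"
SqlPackage /Action:Import /SourceFile:"C:\temp\MyDatabase.bacpac" /TargetServerName:"DevServer" /TargetDatabaseName:"MyDatabaseCopy"
The export captures schema and data into one file; the import creates a new database from it on the target server.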

Copy tables from one database to the other in PostgreSQL

I have a number of tables in a server-side PostgreSQL database. I want to have them all in another database. Is that possible?
Use the PostgreSQL pg_dump utility with the -t table option to define the tables that should be dumped, and restore them in another database. For more information, see the PostgreSQL pg_dump documentation page.
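For example, a sketch that dumps two tables and loads them elsewhere (database and table names are placeholders):
$ pg_dump sourcedb -t customers -t orders > tables.sql   # dump only the listed tables
$ psql targetdb < tables.sql                             # replay into the other database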

Copy data from local to production

I have two copies of a database: one on my local PC, the other on the production server (web site). I insert data into my local copy. How can I upload the newly inserted data (in the local copy) to the production server?
I recommend you generate insert scripts and then run them on the prod server.
There are many tools to do this: a popular one is RedGate SQL Data Compare.
You can create an incremental backup and restore it on prod.
Or use the Import/Export Wizard if you can gather the new data using several queries.
Add the other machine as a "Linked Server" under "Server Objects".
Then access it like:
SELECT * FROM ServerPotato.DB_AWE.dbo.potato_prices
which is:
<Linked Server>.<Database>.<Schema Owner>.<Table>
The dbo schema part is probably necessary here, whereas normally you might write: DB_AWE..potato_prices
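If the linked server isn't registered yet, it can be added with sp_addlinkedserver (a sketch; 'ServerPotato' is a placeholder, and your provider and security settings may differ):
EXEC sp_addlinkedserver
    @server = 'ServerPotato',
    @srvproduct = '',
    @provider = 'SQLNCLI',
    @datasrc = 'ServerPotato';  -- network name of the remote SQL Server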

Best way to copy a database (SQL Server 2008)

Dumb question - what's the best way to copy instances in an environment where I want to refresh a development server with instances from a production server?
I've done backup-restore, but I've heard of detach-copy-attach, and one guy even told me he would just copy the data files between the file systems...
Are these the three (or two, the last one sounds kind of suspect) accepted methods?
My understanding is that the second method is faster but requires downtime on the source because of the detach aspect.
Also, in this situation (wanting an exact copy of production on a dev server), what's the accepted practice for transferring logins, etc.? Should I just backup and restore the user databases + master + msdb?
The easiest way is actually a script.
Run this on production:
USE MASTER;
BACKUP DATABASE [MyDatabase]
TO DISK = 'C:\temp\MyDatabase1.bak' -- some writeable folder.
WITH COPY_ONLY
This one command makes a complete backup copy of the database onto a single file, without interfering with production availability or backup schedule, etc.
To restore, just run this on your dev or test SQL Server:
USE MASTER;
RESTORE DATABASE [MyDatabase]
FROM DISK = 'C:\temp\MyDatabase1.bak'
WITH
MOVE 'MyDatabase' TO 'C:\Sql\MyDatabase.mdf', -- or wherever these live on target
MOVE 'MyDatabase_log' TO 'C:\Sql\MyDatabase_log.ldf',
REPLACE, RECOVERY
Then save these scripts on each server. One-click convenience.
Edit:
If you get an error when restoring because the logical names don't match, you can get them like this:
RESTORE FILELISTONLY
FROM disk = 'C:\temp\MyDatabaseName1.bak'
If you use SQL Server logins (not windows authentication) you can run this after restoring each time (on the dev/test machine):
use MyDatabaseName;
EXEC sp_change_users_login 'Auto_Fix', 'userloginname', NULL, 'userpassword';
The fastest way to copy a database is the detach-copy-attach method, but the production users will not have database access while the prod DB is detached. You can do something like this if your production DB is, for example, a point-of-sale system that nobody uses during the night.
If you cannot detach the production db you should use backup and restore.
You will have to create the logins if they are not in the new instance. I do not recommend copying the system databases.
You can use the SQL Server Management Studio to create the scripts that create the logins you need. Right click on the login you need to create and select Script Login As / Create.
This will list the orphaned users:
EXEC sp_change_users_login 'Report'
If you already have a login id and password for this user, fix it by doing:
EXEC sp_change_users_login 'Auto_Fix', 'user'
If you want to create a new login id and password for this user, fix it by doing:
EXEC sp_change_users_login 'Auto_Fix', 'user', 'login', 'password'
UPDATE:
My advice below tells you how to script a DB using SQL Server Management Studio, but the default settings in SSMS miss out all sorts of crucial parts of a database (like indexes and triggers!) for some reason. So, I created my own program to properly script a database including just about every type of DB object you may have added. I recommend using this instead. It's called SQL Server Scripter and it can be found here:
https://bitbucket.org/jez9999/sqlserverscripter
I'm surprised no-one has mentioned this, because it's really useful: you can dump out a database (its schema and data) to a script, using SQL Server Management Studio.
Right-click the database, choose "Tasks | Generate Scripts...", and then select to script specific database objects. Select the ones you want to copy over to the new DB (you probably want to select at least the Tables and Schemas). Then, on the "Set Scripting Options" screen, click "Advanced", scroll down to "Types of data to script" and select "Schema and data". Click OK and finish generating the script. You'll see that this generates a long script that creates the database's tables and inserts the data into them. You can then create a new database, and change the USE [DbName] statement at the top of the script to the name of the new database you want to copy the old one to. Run the script, and the old database's schema and data will be copied to the new one.
This allows you to do the whole thing from within SQL Server Management studio, and there's no need to touch the file system.
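One caveat: the generated script can get very large, and SSMS may struggle to reopen it. In that case you can run it from the command line with sqlcmd instead (a sketch; the file name is a placeholder, and the USE [DbName] statement at the top of the script selects the target database):
sqlcmd -S localhost -i GeneratedScript.sql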
Below is what I do to copy a database from the production env to my local env:
Create an empty database in your local SQL Server
Right-click on the new database -> Tasks -> Import Data
In the SQL Server Import and Export Wizard, select the production env's server name as the data source, and select your new database as the destination.
It's hard to detach your production DB or other running DBs and deal with that downtime, so I almost always use a backup/restore method.
If you also want to make sure to keep your logins in sync, check out the MS KB article on using the stored proc sp_help_revlogin to do this.
The detach/copy/attach method will take down the database. That's not something you'd want in production.
The backup/restore will only work if you have write permissions to the production server. I work with Amazon RDS and I don't.
The import/export method doesn't really work because of foreign keys, unless you do tables one by one in the order they reference one another. You can do an import/export to a new database; that will copy all the tables and data, but not the foreign keys.
This sounds like a common operation one needs to do with a database, so why isn't SQL Server handling this properly? Every time I had to do this it was frustrating.
That being said, the only painless solution I've encountered was the SQL Azure Migration Tool, which is maintained by the community. It works with SQL Server too.
I run an SP to DROP the table(s) and then use a DTS package to import the most recent production table(s) onto my development box.
Then I go home and come back the following morning. It's not elegant; but it works for me.
If you want to take a copy of a live database, do the Backup/Restore method.
[In SQL Server 2000, not sure about 2008:] Just keep in mind that if you are using SQL Server accounts in this database (as opposed to Windows accounts), and the master DB is different or out of sync on the development server, the user accounts will not translate when you do the restore. I've heard about an SP to remap them, but I can't remember which one it was.

Table-level backup

How to take table-level backup (dump) in MS SQL Server 2005/2008?
You cannot use the BACKUP DATABASE command to backup a single table, unless of course the table in question is allocated to its own FILEGROUP.
What you can do, as you have suggested, is export the table data to a CSV file. Then, to get the definition of your table, you can 'script out' the CREATE TABLE statement.
You can do this within SQL Server Management Studio, by:
right-clicking the database > Tasks > Generate Scripts
You can then select the table you wish to script out and also choose to include any associated objects, such as constraints and indexes.
To get the DATA along with the schema, choose Advanced on the Set Scripting Options tab, and in the General section set "Types of data to script" to "Schema and data".
I am using the bulk copy utility to achieve table-level backups
to export:
bcp.exe "select * from [MyDatabase].dbo.Customer " queryout "Customer.bcp" -N -S localhost -T -E
to import:
bcp.exe [MyDatabase].dbo.Customer in "Customer.bcp" -N -S localhost -T -E -b 10000
As you can see, you can export based on any query, so you can even do incremental backups with this. Plus, it is scriptable, as opposed to the other methods mentioned here that use SSMS.
Here are the steps you need (full-size screenshot: http://i.imgur.com/y6ZCL.jpg). Step 5 is important if you want the data; Step 2 is where you can select individual tables.
You can run the query below to take a backup of an existing table; it creates a new table with the structure of the old table, along with its data. (Note that SELECT ... INTO copies columns and data, but not indexes, constraints, or triggers.)
select * into newtablename from oldtablename
To copy just the table structure, use the query below.
select * into newtablename from oldtablename where 1 = 2
This is similar to qntmfred's solution, but using a direct table dump. This option is slightly faster (see BCP docs):
to export:
bcp "[MyDatabase].dbo.Customer " out "Customer.bcp" -N -S localhost -T -E
to import:
bcp [MyDatabase].dbo.Customer in "Customer.bcp" -N -S localhost -T -E -b 10000
If you're looking for something like MySQL's DUMP, then good news: SQL Server 2008 Management Studio added that ability.
In SSMS, just right-click on the DB in question and select Tasks > Generate Scripts. Then in the 2nd page of the options wizard, make sure to select that you'd like the data scripted as well, and it will generate what amounts to a DUMP file for you.
Create a new filegroup, put the table on it, and back up that filegroup only.
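A sketch of what that looks like in T-SQL (database, filegroup, and path names are placeholders; the table must be created on, or moved to, the new filegroup first):
ALTER DATABASE MyDatabase ADD FILEGROUP TableFG;
ALTER DATABASE MyDatabase ADD FILE
    (NAME = 'TableFGData', FILENAME = 'C:\Sql\TableFGData.ndf')
    TO FILEGROUP TableFG;
-- create or rebuild the table ON TableFG, then back up just that filegroup:
BACKUP DATABASE MyDatabase FILEGROUP = 'TableFG' TO DISK = 'C:\temp\TableFG.bak';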
You can use the free Database Publishing Wizard from Microsoft to generate text files with SQL scripts (CREATE TABLE and INSERT INTO).
You can create such a file for a single table, and you can "restore" the complete table including the data by simply running the SQL script.
I don't know whether it matches the problem described here, but I had to take a table's incremental backup (only newly inserted data should be copied). I designed a DTS package where:
I fetch the new records (on the basis of a 'status' column) and transfer the data to the destination (through a 'Transform Data Task').
Then I just update the 'status' column (through an 'Execute SQL Task').
I had to set up the 'workflow' properly.
Use the SQL Server Import and Export Wizard.
In SSMS:
Open the Database Engine
Right-click the database containing the table to export
Select "Tasks"
Select "Export Data..."
Follow the Wizard
Every recovery model lets you back up a whole or partial SQL Server database, or individual files or filegroups of the database. Table-level backups cannot be created.
From: Backup Overview (SQL Server)
You probably have two options, as SQL Server doesn't support table backups. Both would start with scripting the table creation. Then you can either use the Script Table - INSERT option, which will generate a lot of INSERT statements, or you can use Integration Services (DTS with 2000) or similar to export the data as CSV or the like.
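For the export step, bcp can also write a character-format file directly (a sketch; the names are placeholders; -c writes character data and -t sets the field terminator):
bcp "select * from [MyDatabase].dbo.Customer" queryout "Customer.csv" -c -t, -S localhost -T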
BMC Recovery Manager (formerly known as SQLBacktrack) allows point-in-time recovery of individual objects in a database (aka tables). It is not cheap but does a fantastic job:
http://www.bmc.com/products/proddocview/0,2832,19052_19429_70025639_147752,00.html
http://www.bmc.com/products/proddocview/0,2832,19052_19429_67883151_147636,00.html
If you are looking to be able to restore a table after someone has mistakenly deleted rows from it you could maybe have a look at database snapshots. You could restore the table quite easily (or a subset of the rows) from the snapshot. See http://msdn.microsoft.com/en-us/library/ms175158.aspx
A free app named SqlTableZip will get the job done.
Basically, you write any query (which, of course can also be [select * from table]) and the app creates a compressed file with all the data, which can be restored later.
Link:
http://www.doccolabs.com/products_sqltablezip.html
Handy Backup automatically makes dump files from MS SQL Server, including MSSQL 2005/2008. These dumps are table-level binary files, containing exact copies of the particular database content.
To make a simple dump with Handy Backup, please follow the next instruction:
Install Handy Backup and create a new backup task.
Select “MSSQL” on Step 2 as the data source. In the new window, mark the database to back up.
Select among the different destinations where you will store your backups.
On Step 4, select the “Full” backup option. Set up a time stamp if you need it.
Skip Step 5 unless you need to compress or encrypt the resulting dump file.
On Step 6, set up a schedule for the task to create dumps periodically (otherwise, run the task manually).
Again, skip Step 7, and give your task a name on Step 8. The task is finished!
Now run your new task by clicking the icon before its name, or wait for the scheduled time. Handy Backup will automatically create a dump of your database.
Then open your backup destination. You will find a folder (or a couple of folders) with your MS SQL backups. Each such folder will contain a table-level dump file, consisting of binary tables and settings compressed into a single ZIP.
Other Databases
Handy Backup can save dumps for MySQL, MariaDB, PostgreSQL, Oracle, IBM DB2, Lotus Notes, and any generic SQL database that has an ODBC driver. Some of these databases require additional steps to establish a connection between the DBMS and Handy Backup.
The tools described above often dump SQL databases as a table-level SQL command sequence, making these files ready for any manual modifications you need.
