Create a local PostgreSQL database from a backup of a server database

In a project that I support, we already have PostgreSQL databases in different environments: Development, Integration, and Production.
I know we can take a backup of the Integration database with pg_dump and restore it to Development in order to sync those databases.
However, I want to understand whether I can use the backup file from pg_dump to create the database locally on my own system.

A "local database" is not substantially different from a "remote database", so yes, that should work.
As always, note that restoring a dump on a lower PostgreSQL version than the one where it was taken is not supported (and will often fail).
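Assuming the Integration server is reachable from your machine, a minimal sketch of dumping remotely and restoring locally could look like this (hostnames, user names, and database names are placeholders):

```shell
# Dump the Integration database in custom format (-Fc), which pg_restore can read.
pg_dump -h integration.example.com -U app_user -Fc -f integration.dump app_db

# Create an empty local database, then restore the dump into it.
createdb -U postgres app_db_local
pg_restore -U postgres -d app_db_local integration.dump
```

The local pg_restore should be at least as new as the pg_dump that produced the file, given the version caveat about restoring on older releases.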

Related

Edit SQL Server backup file to change database and file paths to allow restoring multiple times to same Amazon RDS instance as different databases

Goal: Back up and restore a SQL Server database multiple times onto an Amazon RDS SQL Server instance with different database and file names.
So Amazon RDS added the ability to access SQL Server database backups and "import" and "export", yay! But you can't change the database name or the file names, boo!
For non-production databases, I want to put them on a single RDS instance, e.g. dev, test, integration, etc. since I don't need much performance and it would save a lot of money.
I have been seeking to come up with a solution for cloning a database onto an Amazon RDS instance while specifying the database name. I don't want to (i.e., am not allowed to) spend $6000 on Red Gate SQL Clone. Trying to hack together a combination of scripting, bcp, import/export, etc. is likely going to take a lot of time.
With the introduction of importing/exporting a database in RDS via SQL backups, I have a new option. The problem is I can't specify the database and file names on "import" (restore).
I thought about writing a script that gets the database backup from RDS, restores it to a local SQL Server Express instance specifying the database name and files that I'll want on the destination, then backup this, then import/restore to Amazon. This is an option but it will take WAY longer than is probably practical.
So... my final thought at this point and my question: is there a reliable way to simply edit/patch the backup file to change the database and file names?
Even if you could afford SQL Clone, I'm not sure it would function on AWS, as I believe it requires Windows Hyper-V, which isn't supported on Windows Server VMs on AWS.
Windocks has also just released support for SQL Server cloning, but they also use a Hyper-V-based approach... so if you have options outside of AWS, I believe their solution fits your budget... but again, not on AWS.
Disclosure: I am the Co-Founder of WinDocks
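The local-restore-and-rename workflow described in the question can be sketched roughly as below. This is an illustration only: the endpoint, credentials, paths, and S3 bucket are placeholders, and the `rds_backup_database`/`rds_restore_database` stored procedures are the ones Amazon documents for RDS native backup/restore.

```shell
# 1. Take a native backup on RDS and push it to S3 (run via sqlcmd against the RDS endpoint).
sqlcmd -S myrds.example.rds.amazonaws.com -U admin -P "$PASS" -Q \
  "exec msdb.dbo.rds_backup_database @source_db_name='AppDb', \
       @s3_arn_to_backup_to='arn:aws:s3:::my-bucket/AppDb.bak'"

# 2. Download the backup and restore it locally under a new name,
#    relocating the physical data/log files with MOVE (logical names stay the same).
aws s3 cp s3://my-bucket/AppDb.bak .
sqlcmd -S "localhost\SQLEXPRESS" -Q \
  "RESTORE DATABASE AppDb_Dev FROM DISK='C:\backups\AppDb.bak' \
   WITH MOVE 'AppDb'     TO 'C:\data\AppDb_Dev.mdf', \
        MOVE 'AppDb_log' TO 'C:\data\AppDb_Dev_log.ldf'"

# 3. Back up the renamed database and upload it so RDS can restore it
#    under the new database name.
sqlcmd -S "localhost\SQLEXPRESS" -Q \
  "BACKUP DATABASE AppDb_Dev TO DISK='C:\backups\AppDb_Dev.bak'"
aws s3 cp 'C:\backups\AppDb_Dev.bak' s3://my-bucket/
sqlcmd -S myrds.example.rds.amazonaws.com -U admin -P "$PASS" -Q \
  "exec msdb.dbo.rds_restore_database @restore_db_name='AppDb_Dev', \
       @s3_arn_to_restore_from='arn:aws:s3:::my-bucket/AppDb_Dev.bak'"
```

As the question notes, the transfer time for a large database makes this slow in practice; the sketch just makes the moving parts concrete.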

Copying a production database using MySQL

I have a huge production MySQL database of around 30 GB on a remote machine. I want to make a copy of that DB in my local MySQL setup, but I don't want to use SQL dump files.
Is there an alternative way to copy the production DB to my local machine without using SQL dump files? Please tell me.
Thanks in advance.
If you are using MySQL 5.x, you can use the replication mechanism to maintain a "mirror" database. You can run replication, then stop the slave database and back it up very quickly without stopping the master.
If you want to use it for backup - you can find more information here:
Using Replication for Backups at dev.mysql.com
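A minimal sketch of that setup for MySQL 5.x follows; the hostnames, credentials, and binary log coordinates are placeholders you would take from your own master.

```shell
# On the master (my.cnf): enable binary logging and set a unique server ID.
#   [mysqld]
#   server-id = 1
#   log-bin   = mysql-bin

# On the local slave (my.cnf): a different server ID.
#   [mysqld]
#   server-id = 2

# Point the local slave at the production master and start replicating.
mysql -u root -p -e "
  CHANGE MASTER TO
    MASTER_HOST='prod.example.com',
    MASTER_USER='repl',
    MASTER_PASSWORD='secret',
    MASTER_LOG_FILE='mysql-bin.000001',
    MASTER_LOG_POS=4;
  START SLAVE;"

# Once the slave has caught up, stop it and copy its data directory
# for a consistent snapshot while the master keeps running.
mysql -u root -p -e "STOP SLAVE;"
```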

Restore to remote database from my local machine

I am registered at GoDaddy and want to restore the database there from my local machine. The tool they provide doesn't work unless the backup comes from them. I'm trying to restore from my local SQL Server, but when I browse I can't restore my local backup files to the remote database.
They're intentionally preventing users from restoring backups that are "foreign" to them in order to satisfy an obscure Microsoft security recommendation.
You will have to perform a schema comparison and a data comparison between your local machine and the empty database on the hosting to generate the scripts that re-create all of the objects and data. (Having those scripts available in source control would also be helpful.)
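Once you have those scripts, applying them could look like this; the hostname, credentials, database name, and script file names below are placeholders:

```shell
# After generating schema and data scripts locally (e.g. with SSMS "Generate Scripts"
# or a comparison tool), run them against the empty hosted database.
sqlcmd -S mydb.hosting.example.com -U dbuser -P "$PASS" -d MyDb -i schema.sql
sqlcmd -S mydb.hosting.example.com -U dbuser -P "$PASS" -d MyDb -i data.sql
```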
There are basically two ways to restore a database using SQL Server:
Through the SSMS Restore utility.
With a manual database restore script from here.

Tool to copy SQL Server 2008 db to SQL Server 2008 Express?

I have a typical dev scenario: I have a SQL 2008 database that I want to copy every so often to my local instance of 2008 Express so that I can do dev, make changes, etc. to the local copy. I have some constraints though: the source db is part of a live e-commerce site in shared hosting so I can't detach it and the hosting service wants me to pay $5 for each ad hoc back up I invoke.
What I'd like is some tool that I can invoke ad hoc to take a snapshot (complete, not incremental) of the live db that I can then import in my local one. I've tried the SSMS 2008 Copy Database Wizard but it gives me an error saying I can't do that with Express. I tried the Generate Scripts tool and thought that was going to make it - the export to my local disk worked but when I went to import using SQLCMD (the script was 1GB so SSMS errored when I tried to open it there), it told me there was a syntax error a few thousand lines in.
Coming from the MySQL world, this process is trivial. All I want is an analog of mysqldump and then a command-line way to import that file into a db. Surely there's an easy way to do this in the SQL Server world? This seems like the most basic use-case for developers.
[ Yes, I've seen a few other questions here that seem similar but I didn't think they had the same constraints. ]
Best answer: full backup, restore, pay $5. Anything else seems to me like it'd waste a lot more than $5 worth of time.
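If the host hands you a .bak file after the $5 backup, restoring it locally is two commands; the file paths and database name here are placeholders:

```shell
# Inspect the logical file names contained in the backup.
sqlcmd -S "localhost\SQLEXPRESS" -Q \
  "RESTORE FILELISTONLY FROM DISK='C:\backups\ShopDb.bak'"

# Restore into the local Express instance, relocating the physical files with MOVE
# (use the logical names reported by FILELISTONLY).
sqlcmd -S "localhost\SQLEXPRESS" -Q \
  "RESTORE DATABASE ShopDb FROM DISK='C:\backups\ShopDb.bak' \
   WITH MOVE 'ShopDb'     TO 'C:\data\ShopDb.mdf', \
        MOVE 'ShopDb_log' TO 'C:\data\ShopDb_log.ldf'"
```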
If they don't charge you to run queries against the database, these tools may help. Granted, they are not free, but they are handy on so many fronts that it would be worth buying one. These tools can diff your source and target databases (data, structure, or both) and optionally sync the target database to match the source.
http://www.innovartis.co.uk/
http://www.red-gate.com/products/sql%5Fdata%5Fcompare/index.htm
Try SQL Dumper.
SQL Server Dumper enables you to dump selected SQL Server database tables into SQL INSERT statements that are saved as local .sql files and contain all the data required to create a duplicate table, or to be used for backup purposes. You can choose to create an individual .sql file for each table, or combine all selected tables into a single file.
SQL Server Database Publishing Wizard and osql usually do the trick for me with large databases.

Transfer objects and data between SQL 2005 databases

I want to transfer objects (tables, stored procedures, data, etc.) between two servers (a Dev box and a Live box) and was wondering what the best approach for doing this is.
In SQL Server 2000, you could transfer all objects and data between databases. Now all there is is 'copy data' and 'write a query'. Where has the second option gone?
Both databases are SQL 2005 (with service pack 2). When transferring, primary keys and relationships should be kept intact as well as all the views and other associated data with regards to ASP.NET authentication. Integration Services is not setup up on the live server, so that is not an option.
The only way I can think of is generating scripts and then running them on the other server, but that is more time-consuming than the old way (this is how I am doing it now).
If you are willing to pay, I recommend SQL Compare and SQL Data Compare from Red Gate.
Very useful products.
Database Publishing Wizard
http://sqlhost.codeplex.com/
It's a shame you haven't got Integration Services installed as you could use the "Copy Database Wizard". I believe this creates an SSIS package that runs on the destination server.
If you have Visual Studio 2008, you could try the Data comparison and Schema comparison tools.
Your best bet is probably a schema & data comparison tool; there's various tools listed at http://www.mssqltips.com/tip.asp?tip=1069
You don't mention the scope of your application or the number of developers, etc., so it is a little hard to make any recommendations. However, if your development consists of multiple concurrent projects and multiple developers and you are copying from a Development to Production I would recommend something like the following:
implement 3 "areas": dev, qa, production.
develop all changes in dev; create all changes as scripts; use something like CVS or SourceSafe to track changes on all objects
when changes are ready and tested, run your scripts in qa; this will validate your scripts and your install procedure
when ready, run your scripts and install procedure on production
note: qa is almost identical to production, except for applied changes waiting for their final production install. dev contains any work-in-progress changes, extra debug junk, etc. You can periodically restore a production backup onto qa and dev to resync them (just make sure all developers are aware of this and plan accordingly), because production, qa, and dev will otherwise accumulate more differences over time.
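The script-driven promotion described above can be sketched as follows; the directory layout, server names, and database name are assumptions, not part of the original answer:

```shell
# Each change lives in a versioned script checked into source control,
# e.g. scripts/001_add_orders_table.sql, scripts/002_orders_index.sql, ...

# Apply all scripts to one environment, in order; -b makes sqlcmd
# exit with an error code if a batch fails.
deploy() {
  server="$1"
  for f in scripts/*.sql; do
    sqlcmd -S "$server" -d AppDb -b -i "$f" || { echo "failed: $f"; return 1; }
  done
}

deploy dev.example.com    # develop and debug here first
deploy qa.example.com     # validates both the scripts and the install procedure
deploy prod.example.com   # finally, production
```

Because the same scripts run unchanged in each environment, a successful qa pass is a rehearsal of the production install.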
