Get a local copy of a SQL Server database hosted on Amazon RDS

I have a small (few hundred MB) SQL Server database running on RDS. I've spent several hours trying to get a copy of it onto my local SQL Server 2014 instance. All of the following fail. Any ideas what might work?
Task -> Back Up fails because my admin account isn't given permission to back up to a local drive.
Copy Database fails during package creation with "While trying to find a folder on SQL an OLE DB error was encountered with error code 0x80040E4D".
From SSMS, while connected to the RDS server, running BACKUP DATABASE. This fails with the message "BACKUP DATABASE permission denied in database 'MyDB'", even after running EXEC sp_addrolemember 'db_backupoperator' for the connected user.
Generate Scripts produces a 700MB .sql file. Running it with sqlcmd -i fails partway through, leaving plausible-looking .mdf and .ldf files that can't be attached on the local server (probably because sqlcmd failed to complete and release them).

AWS has finally provided a reasonably easy means of doing this. It requires an S3 bucket.
After creating a bucket called rds-bak I ran the following stored procedure in the RDS instance:
exec msdb.dbo.rds_backup_database
@source_db_name='MyDatabase',
@s3_arn_to_backup_to='arn:aws:s3:::rds-bak/MyDatabase.bak',
@overwrite_S3_backup_file=1;
The following stored procedure returns the status of the backup request:
exec msdb.dbo.rds_task_status @db_name='MyDatabase'
Once it finished I downloaded the .bak file from S3 and imported it into a local SQL Server instance using the SSMS Restore Database... wizard!
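If you'd rather script the local restore than click through the wizard, a minimal T-SQL sketch (the file path and logical file names below are assumptions; check yours with RESTORE FILELISTONLY):
-- Inspect the logical file names stored in the downloaded backup.
RESTORE FILELISTONLY FROM DISK = 'C:\Backups\MyDatabase.bak';
-- Restore locally, relocating the files to local paths.
RESTORE DATABASE MyDatabase
FROM DISK = 'C:\Backups\MyDatabase.bak'
WITH MOVE 'MyDatabase' TO 'C:\SQLData\MyDatabase.mdf',
     MOVE 'MyDatabase_log' TO 'C:\SQLData\MyDatabase_log.ldf';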

The SSIS Import Export Wizard can generate a package to duplicate a whole set of tables. (It's not the sort of Copy Database function that relies on files - it makes a package with data flow components for each table.)
It's somewhat brittle but can be made to work :-)
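If you save the generated package instead of running it straight away, it can be re-run later from the command line with dtexec (the package path here is an assumption):
dtexec /F "C:\Packages\CopyRdsTables.dtsx"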
The SSMS Generate Scripts feature often fails on large data sets because the script containing all the data is simply too large/verbose. This method never scripts out the data.

Check this out: https://github.com/andrebonna/RDSDump
It is a C#.NET console application that searches for the latest snapshot of the origin database, restores it on a temporary RDS instance, generates a BACPAC file, uploads it to S3, and deletes the temporary RDS instance.
You can transform your RDS snapshot into a BACPAC file, which can then be downloaded and imported into your local SQL Server 2014 instance using the technique described here (Azure SQL Database Bacpac Local Restore).
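Alternatively, the downloaded BACPAC can be imported from the command line with SqlPackage.exe; a minimal sketch, with the file name and local server as assumptions:
"C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe" /Action:Import /SourceFile:"C:\Downloads\MyDatabase.bacpac" /TargetServerName:localhost /TargetDatabaseName:MyDatabase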

Redgate's SQL Compare and SQL Data Compare are invaluable for these types of things. They are not cheap (but worth every penny imo). But if this is a one-time thing, you could use the 14 day trial and see how it behaves for you.
http://www.red-gate.com/products/

Related

Where to find the .bak file after RDS native backup

So I followed the AWS documentation to perform a native RDS backup using MS-SQL Server. My goal is to be able to download the .bak file.
The config seems to be correct, and I was able to execute the backup stored procedure.
I also created the option group and linked the S3 bucket to it.
But when I check the S3 bucket, the .bak file is not there, even though the stored procedure completed successfully.
Try using the rds_task_status stored procedure to see whether any errors occurred during the native backup: exec msdb.dbo.rds_task_status @db_name='aa144bgo6mn8srl'. This returns a table of task statuses.
Do you see a lifecycle of Completed when you run this query?
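For what it's worth, rds_task_status can also be called with no parameters to list every task on the instance, or filtered by task ID (the ID value here is an assumption):
exec msdb.dbo.rds_task_status;               -- all tasks on the instance
exec msdb.dbo.rds_task_status @task_id=5;    -- a single task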

AWS SQL Server 2016 Restoring 2 Databases Error Message

I am testing whether a SQL Server-based program can also work on an AWS cloud server running SQL Server 2016. In order to test it, I need to restore 2 databases.
The first one eventually restored fine once I figured it out, restoring the database from the .bak file in my S3 bucket.
So then I tried to restore the 2nd database, using the same restore stored procedure, and got this message:
[2017-12-28 02:44:22.320] The file 'D:\rdsdbdata\DATA\smsystemdata.mdf' cannot be overwritten. It is being used by database 'amwsys'.
[2017-12-28 02:44:22.320] File 'sm_system_data' cannot be restored to 'D:\rdsdbdata\DATA\smsystemdata.mdf'. Use WITH MOVE to identify a valid location for the file.
I can't find where to use the WITH MOVE because it won't let me restore it interactively through the Management Studio restore menu; instead I have to give it a stored procedure command:
exec msdb.dbo.rds_restore_database
@restore_db_name='sample99',
@s3_arn_to_restore_from='arn:aws:s3:::lighthouse-chicago/sample999.bak';
And each time it tells me it can't restore it because it's going to overwrite the first database's files.
Much thanks
bill
I think you have run into an RDS restriction.
I had a similar problem: restoring a backup multiple times to the same DB instance is impossible on RDS.
Here are the RDS restrictions you may have encountered:
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/SQLServer.Procedural.Importing.html
You can't restore a backup file to the same DB instance that was used to create the backup file. Instead, restore the backup file to a new DB instance. Renaming the database is not a workaround for this limitation.
You can't restore the same backup file to a DB instance multiple times. That is, you can't restore a backup file to a DB instance that already contains the database that you are restoring. Renaming the database is not a workaround for this limitation.
If you are in this situation, you can't use the .bak file. To work around it, create the database on the new instance from schema (DDL) scripts and import the table data, as sketched below.
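A minimal sketch of that approach using the bcp utility, repeated per table after the schema has been created from your DDL scripts (instance endpoints, credentials, and object names are all assumptions):
bcp MyDb.dbo.MyTable out MyTable.dat -S old-instance.example.us-east-1.rds.amazonaws.com -U admin -P MyPassword -n
bcp MyDb.dbo.MyTable in MyTable.dat -S new-instance.example.us-east-1.rds.amazonaws.com -U admin -P MyPassword -n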

Automating import of data-tier application (SQL database) from Azure with a Master Key

When I extract a data-tier application from a Microsoft Azure SQL database that has a Master Key, I am unable to import it into SQL Server on my local PC.
You will find others had this issue here: SSMS 2016 Error Importing Azure SQL v12 bacpac: master keys without password not supported
However the steps provided as the answer did not work on my installation.
The steps are:
1. Disable auditing on the server (or database)
2. Drop the database master key with DROP MASTER KEY command.
Microsoft Tech Support verified this solution did not work on my installation of SQL Server, and after actually taking remote control of my PC and troubleshooting, they were unable to determine why this was occurring.
I needed to find a way to remove the Master Key from the bacpac file. I have a PowerShell script that removes the Master Key from the BACPAC file, but it requires extracting files, renaming them, and running scripts from Windows PowerShell to get the DB imported.
Does anyone have a program or set of scripts which would automate the process of removing the Master Key and importing a SQL DB from Azure with a single command?
I am new to this forum. Please do not be harsh with this post. I am trying to do the best I can to help others to save the many hours I spent coming up with this.
I have cobbled together a T-SQL script which calls a Windows Powershell script (also cobbled from multiple sources) to extract a data-tier application (database) from Microsoft Azure SQL database and import it into a database on my local SQL Server by running ONE command. Over the months I found some of the code that is in my scripts from other blogs etc. I am not able to provide the credit due to those folks as I didn't keep track of where I got the info. If you are reading this and you see your code, please take credit. I apologize for not being able to give you the credit for your work.
There may be configuration settings on your PC and your local SQL server that need adjustments as this entire solution requires pretty much full access to your computer. If you run into trouble with compatibility, let me know and I will do the best I can to let you know how my system is configured in case it will help you.
I am using Windows 10 Pro and Microsoft SQL Server Developer (64-bit) v12.0.5207.0
I have placed the two files that do all the work on GitHub here: https://github.com/Wingloader/Auto-Azure-BACPAC-Download.git
GetNewBacpac-forGitHub.sql
GetAzureDB-forGitHub.ps1
WARNING: The PowerShell script file will store your SQL sa password and your Azure SQL login in clear text!
If you don't want to do this, don't use this solution.
My computer is owned and controlled solely by me so I am able to open up the security in my system and I am willing to assume the responsibility of safeguarding it.
The basic steps of my solution are as follows (steps 1 and 2 are optional, as I like to keep a version of the DB I am working with as of the point in time I pull down a clean production copy of my Azure DB):
1. Back up the current DB as MyLocalDB.bak.
2. Restore that backup from step 1 to a new DB with the previous day's date stamped at the end of the DB name (e.g., MyLocalDB20171231).
3. Delete the original MyLocalDB database (needed so we can recreate the DB with the original name later on).
4. Pull down the production database from Azure and create a new database with the name MyLocalDB.
The original DB is deleted in step 3 so that the restored DB can use the original name (important when you have data connections referring to that DB name)
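For illustration, steps 1-3 in T-SQL (database names, logical file names, and paths are assumptions):
-- Step 1: keep a point-in-time backup of the current local DB.
BACKUP DATABASE MyLocalDB TO DISK = 'C:\Backups\MyLocalDB.bak' WITH INIT;
-- Step 2: restore it under a date-stamped name.
RESTORE DATABASE MyLocalDB20171231
FROM DISK = 'C:\Backups\MyLocalDB.bak'
WITH MOVE 'MyLocalDB' TO 'C:\SQLData\MyLocalDB20171231.mdf',
     MOVE 'MyLocalDB_log' TO 'C:\SQLData\MyLocalDB20171231_log.ldf';
-- Step 3: drop the original so its name is free for the Azure copy.
DROP DATABASE MyLocalDB;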
In Step 4, the work of extracting the data-tier application DB from Azure is initiated by this line in the T-SQL:
EXEC master..xp_cmdshell '%SystemRoot%\System32\WindowsPowerShell\v1.0\powershell.exe -File "C:\Git\GetUpdatedAzureDB\GetAzureDB.ps1"'
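Note that xp_cmdshell is disabled by default; if the call above is rejected, it can be enabled once per instance (requires sysadmin):
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE;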
The PowerShell script does the following:
The target for the extract is a file named today.bacpac (hardcoded). The first thing to do is delete that file if it already exists.
Extract the DB from Azure into the today.bacpac file.
Note: my DB on Azure has a Master Key for encryption. This needs to be removed from the files prior to importing the bacpac file into your local DB, or the import will fail (this may not be required in SQL 2017, according to my previous conversations with MS Support). If you do not use a Master Key, you can either strip out the code that does this step or just leave it alone; it won't remove anything that isn't there and just adds a little overhead.
Open the today.bacpac file (a zip file) and remove the MasterKey node from the Origin.xml file.
Modify the Model.xml file to update the SHA hash. This is required so the file does not appear to have been tampered with when SQL Server opens the bacpac file.
Re-zip the files into a new file, today-patched.bacpac.
Run this line of code (from PowerShell) to import the bacpac file into SQL Server:
& "C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe" /Action:Import /SourceFile:"C:\Git\GetUpdatedAzureDB\today-patched.bacpac" /TargetConnectionString:"Data Source=MyLocalSQLServer;User ID=sa;Password=MySAPassword;Initial Catalog=MyLocalDB;Integrated Security=false;"
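For reference, the unzip/re-zip portion of such a script can be as simple as this rough PowerShell sketch (paths match the ones above; the XML edits themselves are elided - see the GitHub repo for the real script):
Add-Type -AssemblyName System.IO.Compression.FileSystem
$work = 'C:\Git\GetUpdatedAzureDB\bacpac-work'
# A bacpac is an ordinary zip archive.
[System.IO.Compression.ZipFile]::ExtractToDirectory('C:\Git\GetUpdatedAzureDB\today.bacpac', $work)
# ...remove the MasterKey node and update the hash here, as described above...
[System.IO.Compression.ZipFile]::CreateFromDirectory($work, 'C:\Git\GetUpdatedAzureDB\today-patched.bacpac')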
After editing the two files to provide updated paths, usernames and passwords, run the SQL script. You do not need to edit the scripts again. You can run the SQL script again without modification and it will create a new copy of your Azure DB.
Done!

SQL Server Database Primary Data File got lost

SQL Server 2008 R2 stopped all of a sudden, possibly due to a power fluctuation.
I tried all the possible ways to restart it, but every time it fails with the error:
The request failed or the service did not respond in a timely fashion.
Some of the ways I tried are
Making SQL Server log on as "Local System" instead of "NetworkService"
Replacing the Master.mdf and mastlog.ldf files with the copies in the "Bin/Templates" folder
Disabling "VIA" (which was already disabled)
But all in vain :(
On checking further I noticed that the two data files of my database, mydb.mdf and mydb.ldf, are not in the DATA folder; instead there are mydb_1.ndb and mybd_2.ldf files.
How can I recover the mydb.mdf file and restart SQL Server?
Thank you.
SQL data files can be named anything, so mydb_1.ndb could be your data file.
If that's true, you should be able to recover the data by:
Install a new SQL Server instance (SQL Express would work if the DB is < 10 GB)
Move the mydb_1.ndb and mybd_2.ldf files onto that server
Use "Attach..." from SSMS to add the database to the new server
If you are lucky and that .ndb is just a differently named .mdf file, you should be able to access the data.
Then you can repair your existing server (a reinstall will be easier than messing with the master database, unless you've got other DBs on there) and move the database back over, i.e. with the same Attach... method.
Oh - and start backing it up :)
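If you prefer T-SQL to the SSMS Attach... dialog, the attach step is a one-statement sketch (the paths are assumptions; point them at wherever you copied the files):
CREATE DATABASE mydb
ON (FILENAME = 'C:\SQLData\mydb_1.ndb'),
   (FILENAME = 'C:\SQLData\mybd_2.ldf')
FOR ATTACH;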

SQL Server Copy Database Issue

I'm running the Copy Database Wizard on a SQL Server 2008 R2 instance.
The database I want to copy is a SQL Server 2000 database.
I'm copying that database to another SQL Server 2008 R2 instance.
The wizard uses SQL authentication for both servers, and both are sysadmins.
When I run it, I get the following error (FYI I have tried both copying the logins and leaving them out):
Event Name: OnError
Message: ERROR : errorCode=-1073548784 description=Executing the query "sys.sp_addrolemember @rolename = N'RandomRoleName..." failed with the following error: "The role 'RandomRoleName' does not exist in the current database.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
helpFile= helpContext=0 idofInterfaceWithError={C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
StackTrace: at Microsoft.SqlServer.Management.Dts.DtsTransferProvider.ExecuteTransfer()
at Microsoft.SqlServer.Management.Smo.Transfer.TransferData()
at Microsoft.SqlServer.Dts.Tasks.TransferObjectsTask.TransferObjectsTask.TransferDatabasesUsingSMOTransfer()
Any help would be appreciated!
Jim
My suggestion is don't use the Copy Database Wizard. Create a full backup of the database on the 2000 server and then restore it on the 2008 server.
If you google "Microsoft.SqlServer.Management.Dts.DtsTransferProvider.ExecuteTransfer Copy Database Wizard" you will find that many, many people have gotten this same error or other nearly identical SMO errors; no one appears to have gotten past it.
That isn't to say it's impossible... just that restoring a backup is so much easier than the wizard, or than troubleshooting the wizard. Good luck.
The copy wizard misses some security objects; IIRC this is caused by subtle differences in the security tables, principals, etc. between the two versions.
Frankly, the easiest way is to do one of these two:
backup/restore
detach, copy, attach
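Either option is only a couple of lines of T-SQL (database name and paths are assumptions):
-- Option 1: back up on the source, then restore on the target.
BACKUP DATABASE MyDb TO DISK = 'C:\Backups\MyDb.bak';
-- Option 2: detach, copy the .mdf/.ldf files to the target, then attach.
EXEC sp_detach_db 'MyDb';
-- on the target: CREATE DATABASE MyDb ON (FILENAME = 'D:\SQLData\MyDb.mdf') FOR ATTACH;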
If you don't have access to the O/S and can't get it, another option is to create the missing role(s) in the background as the copy runs. You have to catch it between the creation of the files and the point where it tries to reference the roles, but there are a few seconds in which to create them if you keep clicking Execute - I managed to create 9 roles this way.
Unfortunately, you'll end up with the roles in another database too (while yours cannot be used), so those need to be deleted afterwards.
Of course, this is only an option when you really can't use the other method.
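Creating a missing role as the copy runs is a one-liner; the role name comes from the wizard's error message, and the database you run it in depends on the timing caveat above (names here are hypothetical):
USE MyTargetDb;
CREATE ROLE RandomRoleName;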
Though the backup-based answer solves the problem in general, after facing the same issue several times I was able to trace the root of the problem using the Windows Event Viewer: the Copy Database Wizard creates a SQL Agent job, and the Agent then runs it using its own credentials (i.e., the account you can look up in Windows Services; in my case, NT Service\SQLAgent$SQL2014).
All you need to do is go to the folder where SQL Server creates DB files (e.g., C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA by default for SQL 2014) and give the SQL Agent Windows user read/write access on the folder.
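From an elevated command prompt, that grant looks roughly like this (the path and service account are the SQL 2014 defaults mentioned above; adjust for your instance name):
icacls "C:\Program Files\Microsoft SQL Server\MSSQL12.SQL2014\MSSQL\DATA" /grant "NT Service\SQLAgent$SQL2014":(OI)(CI)M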
The reason can be that a file with the new database name already exists on the filesystem. We encountered this when we renamed database X to X_Old and tried to copy database Y to X. This cannot be done, because database X_Old is still associated with the filename X.
Either delete the conflicting database, or rename the file on the file system.
See http://codecopy.wordpress.com/2012/01/03/error-while-copying-a-database/
