Migrate to Amazon SQL Server RDS

I have been attempting to move from a regular SQL Server on a Win2008 server to SQL Server on Amazon AWS RDS.
I thought a simple backup and restore would work. However, AWS RDS doesn't seem to give you access to a file system, so the SQL backup/restore commands all seem to need a local file system on the source and destination servers. I attempted the following script:
exec sp_addlinkedserver @server='test.xxxx.us-east-1.rds.amazonaws.com'
-- Verify that the servers were linked (lists linked servers)
exec sp_linkedservers
EXEC ('RESTORE DATABASE [orchard] FROM DISK = ''C:\Temp\orchard.bak'' WITH FILE = 1, NOUNLOAD, STATS = 10')
AT [test.xxxx.us-east-1.rds.amazonaws.com]
Any suggestions would be helpful.

Download the free 'SQL Azure Migration Wizard' from CodePlex -- I did a short blog/screencast about this. Be sure to set the 'TO' setting in the wizard to the AWS DNS name, and then use 'SQL Server 2008' rather than 'SQL Azure'.

The official word I got from AWS support on migration of SQL databases using .bak files is that it is not supported. So no more quick restores from .bak files. They offered the official help for migrating existing databases here:
Official AWS database migration guide
And they also gave me an unofficial wink at the Azure database migration tool. Just use it to generate a script of your schema and/or data and execute it against your RDS instance. It's a good tool. You will have to restore the .bak into a non-RDS SQL Server first to do this.
SQL Azure migration tool
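For illustration, a minimal sketch of running the generated script against RDS with sqlcmd (the endpoint, credentials, and file name below are placeholders):

# Execute the generated schema/data script against the RDS endpoint.
sqlcmd -S "test.xxxx.us-east-1.rds.amazonaws.com" -U "masteruser" -P "yourpassword" -d "orchard" -i "C:\Temp\orchard-script.sql"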

You will probably find that the Data-tier Applications BACPAC format will provide you with the most convenient solution. You can use Export to produce a file that contains both the database schema and data. Import will create a new database that is populated with data based on that file.
In contrast to the Backup and Restore operations, Export and Import do not require access to the database server's file system.
You can work with BACPAC files using SQL Server Management Studio or via the API in .Net, Powershell, MSBuild etc.
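For illustration, a sketch of the command-line route with SqlPackage.exe (the server names, credentials, and paths are placeholders):

# Export schema + data from the source server into a BACPAC file.
sqlpackage.exe /Action:Export /SourceConnectionString:"Server=MySourceServer;Database=orchard;Integrated Security=SSPI;" /TargetFile:"C:\Temp\orchard.bacpac"
# Import the BACPAC as a new database on the RDS instance.
sqlpackage.exe /Action:Import /SourceFile:"C:\Temp\orchard.bacpac" /TargetConnectionString:"Server=test.xxxx.us-east-1.rds.amazonaws.com;Database=orchard;User ID=masteruser;Password=yourpassword;"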
Note that there are issues using this method to Export from and then Import to Amazon RDS. When a new database is created on RDS, the following two objects are created within it:
A User with membership in the db_owner role.
The rds_deny_backups_trigger Trigger
During the import there will be a conflict, because these objects are both present in the BACPAC file and automatically created by RDS as the new database is created.
If you have a non-RDS instance of SQL Server handy, then you can Import the BACPAC to that instance, drop the objects above, and then Export the database to create a new BACPAC file. That one will not have any conflicts when you import it into an RDS instance.
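For example, a sketch of that cleanup using Invoke-Sqlcmd from the SqlServer PowerShell module (the instance, database, and user names are placeholders; check sys.database_principals for the actual user RDS created):

# Drop the RDS-generated objects from the intermediate copy before re-exporting.
Invoke-Sqlcmd -ServerInstance "MyLocalServer" -Database "orchard" -Query @"
DROP TRIGGER [rds_deny_backups_trigger] ON DATABASE;
DROP USER [your_rds_master_user];
"@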
Otherwise, it is possible to work around this issue using the following steps.
Edit the model.xml file within the BACPAC file (BACPACs are just zip files).
Remove elements with the following values in their Type attributes that are related to the objects listed above (those that are automatically added by RDS).
SqlRoleMembership
SqlPermissionStatement
SqlLogin
SqlUser
SqlDatabaseDdlTrigger
Generate a checksum for the modified version of the model.xml file using one of the ComputeHash methods on the SHA256 class.
Use the BitConverter.ToString() method to convert the hash to a hexadecimal string (you will need to remove the '-' separators); see the PowerShell sketch after this list.
Replace the existing hash in the Checksum element in the origin.xml file (also contained within the BACPAC file) with the new one.
Create a new BACPAC file by zipping the contents of the original with both the model.xml and origin.xml files replaced with the new versions. Do NOT use System.IO.Compression.ZipFile for this purpose as there seems to be some conflict with the zip file that is produced - the data is not included in the import. I used 7Zip without any problems.
Import the new BACPAC file and you should not have any conflicts with the objects that are automatically generated by RDS.
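For steps 3-5 (rehashing the edited model.xml), a minimal PowerShell sketch, assuming the BACPAC contents were extracted to C:\Temp\bacpac:

# Hash the modified model.xml with SHA256.
$sha256 = [System.Security.Cryptography.SHA256]::Create()
$bytes = [System.IO.File]::ReadAllBytes("C:\Temp\bacpac\model.xml")
# BitConverter returns "AB-CD-EF-..."; strip the separators before pasting
# the value into the Checksum element of origin.xml.
$hash = [System.BitConverter]::ToString($sha256.ComputeHash($bytes)) -replace '-', ''
$hash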
Note: There is another, related problem with importing a BacPac to RDS using SQL Server Management Studio which I explain here.

I wrote up some step-by-step instructions on how to restore a .bak file to RDS using the SQL Azure Migration Tool based on Lynn's screencast. This is a much simpler method than the official instructions, and it worked well for several databases I migrated.

Use the Export Data wizard in SQL Server Management Studio on your source database: right-click on the database > Tasks > Export Data. There is a wizard that walks you through sending the whole database to a remote SQL Server.

There is a tool designed by AWS that will answer most, if not all, of your compatibility questions - the Schema Conversion Tool for SQL Server: https://docs.aws.amazon.com/SchemaConversionTool/latest/userguide/CHAP_Source.SQLServer.html
Because not all SQL Server database objects are supported by RDS, and support even varies across SQL Server versions, the assessment report will be well worth your time as well:
https://docs.aws.amazon.com/SchemaConversionTool/latest/userguide/CHAP_AssessmentReport.html
Lastly, definitely leverage Database Migration Service:
https://aws.amazon.com/dms/

The following article, discussing how to Copy Database With Data – Generate T-SQL For Inserting Data From One Table to Another Table, is what I needed.
http://blog.sqlauthority.com/2009/07/29/sql-server-2008-copy-database-with-data-generate-t-sql-for-inserting-data-from-one-table-to-another-table/

Related

Data migration for .SQB files to Snowflake

I need to migrate .SQB files to Snowflake.
I have a data relay where MSSQL Server database files are saved in .SQB format (Redgate) and available via SFTP, with full backups every week and hourly backups in between.
Our data warehouse is Snowflake, which holds the rest of our data from other sources. I'm looking for the simplest, most cost-effective solution to get my data to Snowflake.
My current ETL process is as follows.
An AWS EC2 instance (Windows) downloads the files and applies Redgate's SQL Backup Converter (https://documentation.red-gate.com/sbu7/tools-and-utilities/sql-backup-file-converter) to convert the files to .BAK. This tool requires a license.
Restore the MS SQL database on the same AWS EC2 instance.
Migrate the MS SQL database to Snowflake via Fivetran.
Is there a simpler / better solution? I'd love to eliminate the need for the intermediate EC2 if possible.
The .SQB files come from an external vendor and there is no way to have them change the file format or delivery method.
This isn't a full solution to your problem, but it might help to know that you're okay to use the SQL Backup file converter wherever you need to, free of any licensing restrictions. This is true for all of SQL Backup's desktop and command-line tools. Licensing only gets involved when dealing with the Server Components, but once a .SQB file has been created you're free to use SQBConverter.exe to convert it to a .BAK file wherever you need to.
My advice would be to either install SQL Backup on whichever machine you want to use the tooling on, or just copy all the files from an existing installation. Both should work fine, so pick whichever is easiest for you.
(FYI: I'm a current Redgate software engineer and I used to work on SQL Backup until fairly recently.)
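For what it's worth, the conversion step can be scripted; this is only a sketch -- the install path and the positional arguments (source, destination, password) are my assumptions, so check the SQBConverter documentation for the exact syntax:

# Convert an .SQB backup to a native .BAK file (paths/password are placeholders).
& "C:\Program Files\Red Gate\SQL Backup 10\SQBConverter.exe" "C:\Backups\mydb.sqb" "C:\Backups\mydb.bak" "backup-password"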
You can:
Step 1: Export Data from SQL Server Using SQL Server Management Studio.
Step 2: Upload the CSV File to an Amazon S3 Bucket.
Step 3: Upload Data to Snowflake From S3 using COPY INTO command.
You can use your own AWS S3 bucket for this and then create an External Stage pointing to the S3 bucket, or you can upload the files into an internal Snowflake stage. (A sketch of these steps follows the links below.)
Copy Into from External Stage -
https://docs.snowflake.com/en/sql-reference/sql/copy-into-table.html#loading-files-from-a-named-external-stage
Copy Into from an Internal Stage -
https://docs.snowflake.com/en/sql-reference/sql/copy-into-table.html#loading-files-from-an-internal-stage
Creating External Stage-
https://docs.snowflake.com/en/sql-reference/sql/create-stage.html
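For illustration, a sketch of steps 2-3 driven through snowsql; the stage, bucket, table, and storage-integration names are placeholders, and the integration is assumed to exist already:

# Create an external stage over the S3 bucket, then bulk-load the exported CSV.
$sql = @"
CREATE STAGE my_s3_stage
  URL = 's3://my-bucket/exports/'
  STORAGE_INTEGRATION = my_s3_integration;

COPY INTO my_table
  FROM @my_s3_stage/mydb_export.csv
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
"@
snowsql -q $sql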

dacpac file Publish error to LocalDB: "The element cannot be deployed. This element contains state that cannot be recreated in the target database."

Pretty simple you would think, but as I cannot edit the schema in this database I have no idea how to get past this error when I am publishing my dacpac file to my local database. I am trying to take a copy of a database that is hosted in Azure and have it locally for my own development purposes. I am not a sysadmin of the database, but I have complete access to it other than that. It is a production database so I can't mess anything up for obvious reasons.
I had a hell of a time even getting this dacpac file created in the first place. I was getting far more errors/warnings when trying to export as a bacpac file with data (which is what I really want to do, but I can worry about that later).
Here is the command I am trying:
SqlPackage.exe /Action:Publish /SourceFile:"C:\Data\opkCore.dacpac" /TargetConnectionString:"Data Source=(localdb)\MSSQLLocalDB;Initial Catalog=opkCore;Integrated Security=true;"
This is what I used to create the dacpac file:
sqlpackage.exe /Action:Export /ssn:tcp:<MyDatabase>.database.windows.net,1433 /sdn:opkCore /su:<MyUserName> /sp:<MyPassword> /tf:C:\Data\opkCore.bacpac
I have tried other solutions such as:
Export Data-tier Application, but I am limited to only doing it in an Azure container and I am not in control of that. It is a Pay-As-You-Go model which does not support blob storage apparently
Copy Database only works for 2005 and earlier and this is SQL 2019
Deploy Database to Microsoft SQL Server Azure SQL Database
Import Data-tier Application, same problem as #1
Exporting bacpac file using SqlPackage.exe - Errors all over the place that I cannot fix. The database is not mine to mess up
I CAN export tables one at a time, but then I am missing certain bits of schema that work together, so I get errors there also.
I really should be able to just get a local copy of the database in the EXACT same state that it currently is on our production server. Any other ideas for me on how I can do this that will ignore problems with the database and just get me a local copy the EXACT way the database is in production? 3rd party tools that do this or anything?
I decided to just script all the tables and run the script on my new DB. There were a lot of errors, but it did what it could which was 99.99% of the database schema and that is good enough for my purpose. Maybe I will try to get the data exported and imported as well.
EDIT: To export the data I just used SSMS Export data and the destination used was the new LocalDB database I just created from the scripts.

What happens when we publish the database project through visual studio

I have been working on a project which has a database project in it, and I used to publish that database whenever I made some changes to the scripts. Now I have noticed that when I publish the database project, it builds first and creates a dacpac file, and then it publishes after I select the target database. I am interested in knowing what role that dacpac file plays in publishing the SQL database.
Also, I found the following when I was trying to read about the pros and cons of dacpacs. Does it really work like that?
Link
The biggest problem with DACPACs has to do with the way a data-tier application is released to push version changes from the DAC into SQL Server. This is done by creating a new database with a temporary name, generating the new objects in the database, and then moving all the data from the existing database to the new one. After all the data has been transferred and the post-release scripts run, the existing database is dropped and the new database is given the correct name.
The dacpac file is the compiled build output of the database project. It's analogous to a .dll file built from a C# class library project. All of the information you defined in your database project about your database is stored in the dacpac file, along with information about the relationships between the objects.
When a dacpac file is published, the target database is compared to the dacpac and the tool will figure out what T-SQL to execute to make the target database match the dacpac's definition.
Regarding the article, note that the Data-Tier Application Framework that shipped with SQL Server 2008 R2 was largely rewritten/replaced for SQL Server 2012, so that article, while correct regarding that very old version of the Data-Tier Application Framework, is not correct regarding the tools available today.
The DACPAC file is a Zip file that contains an XML representation of your database schema. It does not contain any table data (unless you provide pre- and post-deployment scripts). More information is available here: https://www.simple-talk.com/sql/database-delivery/microsoft-and-database-lifecycle-management-(dlm)-the-dacpac/
When a DACPAC is deployed, the receiving server compares the current schema against the DACPAC's and generates a change script to update the schema accordingly. However, be careful, as some changes can be very expensive (such as adding a new column in the middle of a table that already has millions of rows).
The article I linked to shows you how you can view the generated change script and see what happens. Repeated here is a snippet that does it:
"%ProgramFiles(x86)%\Microsoft SQL Server"\110\DAC\bin\sqlpackage.exe
/Action:Script
/SourceFile:MyPathAndFileToTheDacPac
/TargetConnectionString:"Server=MyTargetInstance;Database=MyTargetDatabase;Integrated Security=SSPI;"
/OutPutPath:"MyPathAndFile.sql"
Using DACPACs and Database Projects (in SSDT, but do not use SQL Server Management Studio) is the preferred way of pushing database changes now as it is less error-prone than manually redesigning tables using the table designer (which will drop-recreate-and-repopulate tables if you do things like add non-terminal columns to existing tables).
I'm not too familiar with it, but I have played around with some database uploads myself. From what I gathered, the dacpac has settings that can be used and uploaded. I found these instructions:
• To create a database project based on a dacpac, create a new SQL Server Database Project in Visual Studio. Then right-click on the project in Solution Explorer, choose "Import -> Data-tier Application (*.dacpac)", and select your dacpac. That will convert the contents of the dacpac into scripts in the project, and if you choose "Import database settings" the database options will be set based on the settings in the dacpac.
A dacpac packages a data-tier application (DAC): "A data-tier application (DAC) is a logical database management entity that defines all of the SQL Server objects - like tables, views, and instance objects, including logins - associated with a user's database. A DAC is a self-contained unit of SQL Server database deployment that enables data-tier developers and database administrators to package SQL Server objects into a portable artifact called a DAC package, also known as a DACPAC." From https://msdn.microsoft.com/en-us/library/ee210546.aspx
hope this helps...

Exporting database on oracle

I have an Oracle DB on Windows Server 2003. How do I export it with all the data and move it to another Windows server?
Use RMAN to take a full backup. Then restore it on the new server.
See Clone using RMAN Article
You can use Oracle Data Pump to export and import the database. Quote from the documentation:
Oracle Data Pump is a feature of Oracle Database 11g Release 2 that enables very fast bulk data and metadata movement between Oracle databases.
The procedure is like this:
Export existing database using expdp utility
Install Oracle database server on new Windows server
Import database on new server using impdp utility
Check this link: Oracle Data Pump. There you will find complete documentation and examples how to use this utility.
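For illustration, steps 1 and 3 might look like this (the connect strings and credentials are placeholders; DATA_PUMP_DIR is the default Data Pump directory object, but yours may differ):

# Old server: full export to a dump file.
expdp system/password@orcl FULL=Y DIRECTORY=DATA_PUMP_DIR DUMPFILE=fulldb.dmp LOGFILE=expdp_full.log
# Copy fulldb.dmp to the new server's DATA_PUMP_DIR, then import it.
impdp system/password@neworcl FULL=Y DIRECTORY=DATA_PUMP_DIR DUMPFILE=fulldb.dmp LOGFILE=impdp_full.log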
If you want to create an exact copy of an existing database on a new server with the same operating system (though not necessarily the same O/S version) and the same Oracle version, the quickest and least problematic method is to just copy the database files. This is often referred to as database cloning, and it is a common method DBAs use to set up development and test databases that are intended to be exact duplicates of production databases.
Stop all instances of the database on the existing system. You could log in to each instance "as sysdba" using SQLPlus and run the "shutdown immediate" command. You could also stop the Windows Services for the instances. They are named OracleServiceSID, where "SID" is the instance name. Usually there is just one instance, but there could be multiple instances of a single database. All instances must be stopped for this procedure.
Locate the database files. Look for an "oradata" folder somewhere below the Oracle root folder and then find the folder for the database sid in there. (There could be multiple oradata folders. You need to find the one that has the folder named for the SID of your database.) There are also the files in the Admin folder for the sid as well as the %ORACLE_HOME%/database folder. If DBCA had been used to create the database, then the location of all of these files varies by the Oracle version.
Once you have identified all of the files for the database, you can use any method at your disposal to copy these files to the same locations on the new server. (Note: The database files, control files, and redo logs must be placed in the same locations (i.e., file system paths) where they exist on the old server. Otherwise, configuration files must be changed and commands must be run to alter the database's internal file paths.) The parameter file (initSID.ora) and server parameter file (spfileSID.ora) must be placed in the %ORACLE_HOME%/database folder.
On the new server, you must run the oradim utility. (Note: oradim is an Oracle utility that is specific to Windows and is used to create, maintain, and delete instance services.) Here is a sample command:
oradim -new -sid yourdbsid -startmode automatic
Start up the database with SQLPlus, and you should be in business.
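For illustration, the key commands from the procedure above might look like this in PowerShell (the SID is a placeholder and the file-copy step is elided):

# Old server: stop the instance cleanly before copying the files.
"shutdown immediate" | sqlplus / as sysdba
# (Copy the data files, control files, redo logs, and initSID.ora/spfileSID.ora
# to the same paths on the new server.)
# New server: create the instance service, then start the database.
oradim -new -sid yourdbsid -startmode automatic
"startup" | sqlplus / as sysdba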
This is a general overview of the process, but it should help you get the job done quickly and easily. The problem with other tools is the need to create an empty database on the target server before loading the data by whatever means. If the target server has a different version of Oracle, it will be necessary to run data dictionary scripts to upgrade or downgrade the database. (Note: A downgrade may not always be possible.) If the new server has a different O/S, then the above procedure would require additional steps that would significantly increase its complexity.
It is also possible to duplicate a database using RMAN. Google the words "clone oracle database using rman" to find some good sites on how this is done with that tool. If you are not already using RMAN, the procedure I have described above would probably be the way to go.

Tool to copy SQL Server 2008 db to SQL Server 2008 Express?

I have a typical dev scenario: I have a SQL 2008 database that I want to copy every so often to my local instance of 2008 Express so that I can do dev, make changes, etc. to the local copy. I have some constraints though: the source db is part of a live e-commerce site in shared hosting so I can't detach it and the hosting service wants me to pay $5 for each ad hoc back up I invoke.
What I'd like is some tool that I can invoke ad hoc to take a snapshot (complete, not incremental) of the live db that I can then import in my local one. I've tried the SSMS 2008 Copy Database Wizard but it gives me an error saying I can't do that with Express. I tried the Generate Scripts tool and thought that was going to make it - the export to my local disk worked but when I went to import using SQLCMD (the script was 1GB so SSMS errored when I tried to open it there), it told me there was a syntax error a few thousand lines in.
Coming from the MySQL world, this process is trivial. All I want is an analog of mysqldump and then a command-line way to import that file into a db. Surely there's an easy way to do this in the SQL Server world? This seems like the most basic use-case for developers.
[ Yes, I've seen a few other questions here that seem similar but I didn't think they had the same constraints. ]
Best answer: full backup, restore, pay $5. Anything else seems to me like it'd waste a lot more than $5 worth of time.
If they don't charge you to run queries against the database, these tools may help. Granted, these are not free tools, but they are handy on so many fronts that it would be worth buying one. These tools can diff your source db and target db (both data and structure, or just one or the other) and optionally sync the target database to be just like the source.
http://www.innovartis.co.uk/
http://www.red-gate.com/products/sql%5Fdata%5Fcompare/index.htm
Try SQL Dumper.
SQL Server Dumper enables you to dump selected SQL Server database tables into SQL INSERT statements, that are saved as local .sql files and contain all the data required to create a duplicate table, or to be used for backup purposes. You can choose to create an individual .sql file for each table, or combine all selected tables into a single file.
SQL Server Database Publishing Wizard and osql usually do the trick for me with large databases.
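When the Generate Scripts output is too large for SSMS to open, a sqlcmd invocation along these lines usually works (the instance, database, and paths are placeholders):

# Run the scripted database against the local Express instance, logging output to a file.
sqlcmd -S ".\SQLEXPRESS" -d "MyLocalCopy" -E -i "C:\Temp\live-db-script.sql" -o "C:\Temp\import-log.txt"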
