How to migrate the schema of a SQL Server database from on-premises to SQL Server on AWS RDS with unsupported features - sql-server

I'm working on taking an on-premises server running SQL Server 2019 and migrating it to the cloud. The data is not the important thing right now, but rather the schema, since this is a proof of concept. The main issue is that the on-premises server sometimes uses FILESTREAM to handle files. This will have to change in the future as refactoring and application updates take place.
The easiest way, I thought, would be to generate a schema .sql script from the old DB and run that in the new environment, but this generated a ton of errors (around 25,000).
Most of the errors include:
Failed permissions in database 'master'
Not finding certain objects in the new clean DB
Extended properties are not permitted on an object or it doesn't exist
Invalid data types
Database doesn't exist or permission not allowed
Filestream feature is disabled
So this probably won't work as a drop-in solution to get the schema migrated to the new DB. I've heard about AWS DMS (Database Migration Service), but I don't know a lot about it. What tools could I look into to migrate over to RDS when RDS doesn't support features native to SQL Server?

One way to import the schema is through the Generate Scripts wizard. You will have to manually tweak some things to make FILESTREAM and the local configuration of the SQL Server work nicely with AWS RDS.
Generate and Publish Scripts Guide
1. Go to the source database.
2. Right-click the database in Object Explorer (the menu on the left) and choose Tasks > Generate Scripts.
3. Select all tables, procedures, etc., except for the FILESTREAM tables.
4. In the scripts wizard pop-up, under Set Scripting Options, choose to make a .sql file; under Advanced options, choose Schema Only. This will generate a script with only the metadata for the tables, not the data in them.
5. Generate the file.
6. Copy the .sql file over to the EC2 instance (probably the bastion host) that is connected to the RDS instance.
7. Open SQL Server Management Studio and right-click the topmost object in Object Explorer to open a new query.
8. Copy and paste the code inside the .sql file into the query window.
9. Change the file path locations of the data and log files to D:\rdsdbdata\DATA\TEST_AWS.mdf and D:\rdsdbdata\DATA\TEST_AWS_Log.ldf respectively. Any other file location will not be recognized by RDS and creation will fail.
10. Comment out or remove the lines of code that include:
a. ALTER DATABASE [TEST_AWS] SET TRUSTWORTHY OFF
b. ALTER DATABASE [TEST_AWS] SET HONOR_BROKER_PRIORITY
c. ALTER DATABASE [TEST_AWS] SET DB_CHAINING OFF
d. statements creating global users
e. anything FILESTREAM-related
11. Execute the script.
Consider adding DROP DATABASE [TEST_AWS] towards the top of the script, before the creation of the new database, in case you need to run the script multiple times to find the errors. This will save you from overwrite errors or from being left with an unfinished table.
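For illustration, the top of the adjusted script might end up looking like this. A minimal sketch: the database name and RDS paths come from the steps above, but the IF DB_ID guard and the rest of the layout are my own, not actual SSMS output.
USE master;
-- Drop any half-built copy from a previous attempt (assumes no open connections)
IF DB_ID(N'TEST_AWS') IS NOT NULL
    DROP DATABASE [TEST_AWS];
GO
-- RDS for SQL Server only accepts data/log files under D:\rdsdbdata\DATA
CREATE DATABASE [TEST_AWS]
ON PRIMARY (NAME = N'TEST_AWS', FILENAME = N'D:\rdsdbdata\DATA\TEST_AWS.mdf')
LOG ON (NAME = N'TEST_AWS_log', FILENAME = N'D:\rdsdbdata\DATA\TEST_AWS_Log.ldf');
GO
-- Commented out: not permitted on RDS
-- ALTER DATABASE [TEST_AWS] SET TRUSTWORTHY OFF;
-- ALTER DATABASE [TEST_AWS] SET HONOR_BROKER_PRIORITY OFF;
-- ALTER DATABASE [TEST_AWS] SET DB_CHAINING OFF;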

Related

Data Migration from On Prem to Azure SQL (PaaS)

We have an on-prem SQL Server DB (SQL Server 2017, compatibility level 140) that is about 1.2 TB. We need to do a repeatable migration of just the data to a cloud SQL database (PaaS). The on-prem database has procedures and functions that do cross-DB queries, which eliminates the Data Migration Assistant. Many of the tables that we need to migrate are system-versioned tables (just to make this more fun). Ideally we would like to move the data into a different schema of a different DB so we can avoid the use of external tables (worried about performance).
Moving the data is just the first step as we also need to do an ETL job on the data to massage it into the new table structure.
We are looking at using Azure Data Factory (ADF), but it has trouble with versioned tables unless we turn them off first.
What are other options that we can look at and try, to be able to do this quickly and repeatedly? Do we need to change to IaaS or use a third-party tool? Did we miss options in ADF to handle this?
If I summarize your requirements, you are not just migrating a database to the cloud but the complete architecture of your SQL Server, which includes:
1. 1.2 TB of data,
2. Continuous data migration afterwards,
3. Procedures and functions for cross-DB queries,
4. Versioned tables
Points 1, 3, and 4 can be handled by creating and exporting a .bacpac file using SQL Server Management Studio (SSMS) from on-premises to Azure Blob Storage and then importing that file into Azure SQL Database. The .bacpac file that we create in SSMS allows us to include all the versioned tables, which we can import at the destination database.
Follow this third-party tutorial by SQLShack to migrate data to Azure SQL Database.
The stored procedures can also be moved using SQL scripts. Follow the steps below:
1. Go to the server in Management Studio.
2. Select the database and right-click on it; go to Tasks.
3. Select the Generate Scripts option under Tasks.
4. Once it has started, select the desired stored procedures you want to copy and create a file of them.
5. Run the script from that file against the Azure SQL DB, which you can log in to in SSMS.
The repeatable migration of data is the challenging part. You can try it with Change Data Capture (CDC), but I'm not sure that is exactly what your requirement calls for. You can enable CDC at the database level using the command below:
USE <databasename>;
EXEC sys.sp_cdc_enable_db;
To learn more, refer to: https://www.qlik.com/us/change-data-capture/cdc-change-data-capture
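Note that enabling CDC at the database level alone doesn't capture anything; each table you want to track must also be enabled. A minimal sketch, assuming a hypothetical dbo.Orders table:
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Orders',  -- hypothetical table name
    @role_name     = NULL;       -- NULL means no gating role is required to read the change data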

Analysis Services .abf file database restore

I am really, really new to SQL Server; I know how to do a query and other simple stuff. Recently my company was bought by another one. We had a cube server which was accessed by an Excel file via OLAP, using Analysis Services from SQL Server 2008, and it was updated by an .abf file. On the first day after the sale the former server was retired, and all I have access to is this .abf file used to update the cube. I installed SQL Server 2008 Enterprise Edition and I'm trying to restore the file to a new database via Analysis Services, since the only instruction I received from the old IT department is that it needs to be restored via Analysis Services. I searched online for solutions and came across several articles, and none of the steps worked for me because they required an already configured database and were only restoring a backup. I'm thinking I need the .mdf file first so I can recreate the database as is, and then I can update it via the .abf file. Can someone point me in the right direction?
Since you have the .abf file, there are a couple of options to restore it as a new database. You can create a new database with the same name, then restore this database from the .abf file with the AllowOverwrite option set to true. You can also restore directly to a new database by right-clicking the SSAS instance and selecting Restore... From here, specify the backup file name and just enter the name of the database, and this will be created as a new cube. This name must be a new database name; if an existing cube is specified, it will be overwritten. Either approach can be done through an XMLA command in SSMS; an example is below.
<Restore xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <File>C:\YourFilePath\YourCubeBackupFile.abf</File>
  <DatabaseName>TargetOrNewDatabaseName</DatabaseName>
  <AllowOverwrite>true</AllowOverwrite>
</Restore>
Try attaching the database with the .mdf file in SQL Server 2008.
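If you go that route, attaching from the .mdf is a one-statement sketch (the database name and file paths below are placeholders):
CREATE DATABASE [RestoredRelationalDb]
ON (FILENAME = N'C:\Data\YourDatabase.mdf'),
   (FILENAME = N'C:\Data\YourDatabase_log.ldf')
FOR ATTACH;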

How can I generate a script of my database as it is?

My primary reason for this is to keep track of database schema changes for my application. In SQL Server Management Studio I am able to generate a create script which creates the database, but it doesn't contain any test data. Ideally, when the script is run it should DROP the existing database (assuming it already exists) and then recreate it using this new script containing the schema changes and test data from my development machine.
So how can I generate a script that will create a database with all the tables, stored procs, triggers, views, test data, etc?
I've tried using the import/export functionality, but that's no good because it doesn't seem to copy over stored procedures. Plus it would be nice to have a script so I can track changes to the schema using Mercurial.
I am using SQL Server Express 2008 R2 along with SQL Server Management Studio.
You didn't mention which version of SQL Server, but in SQL 2008 this is very easy
SQL 2008
1. Expand Databases.
2. Right-click the database.
3. Choose Tasks > Generate Scripts.
4. The Generate and Publish dialog will open; choose your objects (i.e. tables, procs, etc.).
5. Click Next. On the Set Scripting Options page, choose Advanced Options.
6. Under General, choose Script DROP and CREATE - Script DROP and CREATE.
7. Set Types of Data To Script to Schema and Data.
8. Close the Advanced window and choose to save to file.
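The saved file interleaves DROP/CREATE pairs with INSERTs. A trimmed sketch of the kind of output you get (the table and columns are illustrative, not the exact SSMS formatting):
IF OBJECT_ID(N'dbo.Customers', N'U') IS NOT NULL
    DROP TABLE dbo.Customers;
GO
CREATE TABLE dbo.Customers (
    Id   INT          NOT NULL PRIMARY KEY,
    Name NVARCHAR(50) NOT NULL
);
GO
INSERT INTO dbo.Customers (Id, Name) VALUES (1, N'Alice');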
I wrote an open-source command-line utility named SchemaZen that does this. It's much faster than scripting from Management Studio and its output is more version-control friendly. It supports scripting both schema and data.
To generate scripts run:
schemazen.exe script --server localhost --database db --scriptDir c:\somedir
Then to recreate the database from scripts run:
schemazen.exe create --server localhost --database db --scriptDir c:\somedir
Try Microsoft SQL Server Database Publishing Wizard. This is a powerful, flexible tool for scripting schema/data from SQL Server.
Personally, I use a Microsoft Visual Studio 2010 Database Project with a source control repository (SVN) to track changes to the schema.
Watch out for differences in database collation. If you develop on a database with a case-insensitive collation and try to run the SSMS-generated scripts against a database with a case-sensitive collation, then case differences will break the scripts.
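A quick way to compare the two databases' collations before running anything (the database name is a placeholder):
SELECT DATABASEPROPERTYEX(N'MyDatabase', 'Collation') AS DatabaseCollation;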
Usually I make backups of a database before starting new development on it. The best way is to restore the backup when needed; I don't know how to do it the script way!

Best way to copy a database (SQL Server 2008)

Dumb question - what's the best way to copy instances in an environment where I want to refresh a development server with instances from a production server?
I've done backup-restore, but I've heard detach-copy-attach and one guy even told me he would just copy the datafiles between the filesystems....
Are these the three (or two, the last one sounds kind of suspect) accepted methods?
My understanding is that the second method is faster but requires downtime on the source because of the detach aspect.
Also, in this situation (wanting an exact copy of production on a dev server), what's the accepted practice for transferring logins, etc.? Should I just backup and restore the user databases + master + msdb?
Easiest way is actually a script.
Run this on production:
USE MASTER;
BACKUP DATABASE [MyDatabase]
TO DISK = 'C:\temp\MyDatabase1.bak' -- some writeable folder.
WITH COPY_ONLY
This one command makes a complete backup copy of the database onto a single file, without interfering with production availability or backup schedule, etc.
To restore, just run this on your dev or test SQL Server:
USE MASTER;
RESTORE DATABASE [MyDatabase]
FROM DISK = 'C:\temp\MyDatabase1.bak'
WITH
MOVE 'MyDatabase' TO 'C:\Sql\MyDatabase.mdf', -- or wherever these live on target
MOVE 'MyDatabase_log' TO 'C:\Sql\MyDatabase_log.ldf',
REPLACE, RECOVERY
Then save these scripts on each server. One-click convenience.
Edit:
if you get an error when restoring that the logical names don't match, you can get them like this:
RESTORE FILELISTONLY
FROM disk = 'C:\temp\MyDatabaseName1.bak'
If you use SQL Server logins (not windows authentication) you can run this after restoring each time (on the dev/test machine):
use MyDatabaseName;
sp_change_users_login 'Auto_Fix', 'userloginname', null, 'userpassword';
The fastest way to copy a database is the detach-copy-attach method, but production users will not have database access while the prod DB is detached. You can do something like this if your production DB is, for example, a point-of-sale system that nobody uses during the night.
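If you do go the detach route, the T-SQL side is a short sketch (the database name and paths are placeholders):
-- On production:
USE master;
EXEC sp_detach_db @dbname = N'MyDatabase';
-- Copy MyDatabase.mdf and MyDatabase_log.ldf to the dev server, then on dev:
CREATE DATABASE [MyDatabase]
ON (FILENAME = N'C:\Sql\MyDatabase.mdf'),
   (FILENAME = N'C:\Sql\MyDatabase_log.ldf')
FOR ATTACH;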
If you cannot detach the production DB, you should use backup and restore.
You will have to create the logins if they do not exist in the new instance. I do not recommend copying the system databases.
You can use the SQL Server Management Studio to create the scripts that create the logins you need. Right click on the login you need to create and select Script Login As / Create.
This will list the orphaned users:
EXEC sp_change_users_login 'Report'
If you already have a login id and password for this user, fix it by doing:
EXEC sp_change_users_login 'Auto_Fix', 'user'
If you want to create a new login id and password for this user, fix it by doing:
EXEC sp_change_users_login 'Auto_Fix', 'user', 'login', 'password'
UPDATE:
My advice below tells you how to script a DB using SQL Server Management Studio, but the default settings in SSMS miss out all sorts of crucial parts of a database (like indexes and triggers!) for some reason. So I created my own program to properly script a database, including just about every type of DB object you may have added. I recommend using this instead. It's called SQL Server Scripter and it can be found here:
https://bitbucket.org/jez9999/sqlserverscripter
I'm surprised no one has mentioned this, because it's really useful: you can dump out a database (its schema and data) to a script using SQL Server Management Studio.
Right-click the database, choose "Tasks | Generate Scripts...", and then select to script specific database objects. Select the ones you want to copy over to the new DB (you probably want to select at least the Tables and Schemas). Then, for the "Set Scripting Options" screen, click "Advanced", scroll down to "Types of data to script" and select "Schema and data". Click OK, and finish generating the script. You'll see that this has now generated a long script for you that creates the database's tables and inserts the data into them! You can then create a new database, and change the USE [DbName] statement at the top of the script to reflect the name of the new database you want to copy the old one to. Run the script and the old database's schema and data will be copied to the new one!
This allows you to do the whole thing from within SQL Server Management Studio, and there's no need to touch the file system.
Below is what I do to copy a database from production env to my local env:
Create an empty database on your local SQL Server.
Right-click on the new database -> Tasks -> Import Data.
In the SQL Server Import and Export Wizard, select the production server's name as the data source, and select your new database as the destination.
It's hard to detach your production DB or other running DBs and deal with that downtime, so I almost always use the backup/restore method.
If you also want to make sure your logins stay in sync, check out the MS KB article on using the stored proc sp_help_revlogin to do this.
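Once the two procedures from that KB article (sp_hexadecimal and sp_help_revlogin) are created in master, running it is a one-liner; it prints CREATE LOGIN statements, preserving password hashes and SIDs, which you then run on the destination server:
USE master;
EXEC sp_help_revlogin;  -- scripts all logins; per the KB it also accepts a single login name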
The detach/copy/attach method will take down the database. That's not something you'd want in production.
The backup/restore will only work if you have write permissions to the production server. I work with Amazon RDS and I don't.
The import/export method doesn't really work because of foreign keys - unless you do tables one by one in the order they reference one another. You can do an import/export to a new database. That will copy all the tables and data, but not the foreign keys.
This sounds like a common operation one needs to do with databases. Why isn't SQL Server handling this properly? Every time I had to do this it was frustrating.
That being said, the only painless solution I've encountered was the SQL Azure Migration Tool, which is maintained by the community. It works with SQL Server too.
I run an SP to DROP the table(s) and then use a DTS package to import the most recent production table(s) onto my development box.
Then I go home and come back the following morning. It's not elegant; but it works for me.
If you want to take a copy of a live database, do the Backup/Restore method.
[In SQL Server 2000, not sure about 2008:] Just keep in mind that if you are using SQL Server accounts in this database, as opposed to Windows accounts, and the master DB is different or out of sync on the development server, the user accounts will not translate when you do the restore. I've heard of an SP to remap them, but I can't remember which one it was.

Copy table to a different database on a different SQL Server

I would like to copy a table from one database to another. I know you can easily do the following if the databases are on the same SQL Server.
SELECT * INTO NewTable FROM existingdb.dbo.existingtable;
Is there any easy way to do this if the databases are on two different SQL Servers, without having to loop through every record in the original table and insert it into the new table?
Also, this needs to be done in code, outside of SQL Server Management Studio.
Yes. Add a linked server entry and use SELECT INTO with the four-part database object naming convention.
Example:
SELECT * INTO targetTable
FROM [sourceserver].[sourcedatabase].[dbo].[sourceTable]
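If the linked server entry doesn't exist yet, it can be created in T-SQL first; a minimal sketch, where the server name and remote credentials are placeholders:
-- Register the remote instance as a linked server
EXEC master.dbo.sp_addlinkedserver
    @server = N'sourceserver',  -- network name of the remote SQL Server
    @srvproduct = N'SQL Server';
-- Map all local logins to one remote SQL login (placeholders)
EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname = N'sourceserver',
    @useself = 'FALSE',
    @locallogin = NULL,
    @rmtuser = N'remoteUser',
    @rmtpassword = N'remotePassword';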
If it's only copying tables, then linked servers or generated scripts will work fine, but if the destination table already contains some data, then I'd suggest using a third-party comparison tool.
I'm using Apex Diff, but there are also a lot of other tools out there, such as those from Red Gate or Devart...
Third-party tools are not necessary, of course, and you can do everything natively; they're just more convenient. Even if you're on a tight budget, you can use these in trial mode to get things done.
Here is a good thread on a similar topic with a lot more examples of how to do this in pure SQL.
SQL Server 2012 provides another way to generate a script for SQL Server databases with their objects and data. This script can be used, in our case, to copy the tables' schema and data from the source database to the destination one.
Using the SQL Server Management Studio, right-click on the source database from the object explorer, then from Tasks choose Generate Scripts.
In the Choose Objects window, choose Select Specific Database Objects to specify the tables that you will generate a script for, then choose the tables by ticking the box beside each one. Click Next.
In the Set Scripting Options window, specify the path where you will save the generated script file, and click Advanced.
From the appeared Advanced Scripting Options window, specify Schema and Data as Types of Data to Script. You can decide from here if you want to script the indexes and keys in your tables. Click OK.
Back at the Set Scripting Options window, click Next.
Review the Summary window and click Next.
You can monitor the progress from the Save or Publish Scripts window. If there is no error click Finish and you will find the script file in the specified path.
The SQL scripting method is useful for generating one single script for the tables' schema and data, including the indexes and keys. But again, this method doesn't generate the tables' creation scripts in the correct order if there are relations between the tables.
Microsoft SQL Server Database Publishing Wizard will generate all the necessary insert statements, and optionally schema information as well if you need that:
http://www.microsoft.com/downloads/details.aspx?familyid=56E5B1C5-BF17-42E0-A410-371A838E570A
Generate the scripts?
Generate a script to create the table, then generate a script to insert the data.
Check out sp_generate_inserts for generating the data insert script.
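sp_generate_inserts is a community stored procedure (by Narayana Vyas Kondreddi) that you install into the source database; once created, the basic call is a sketch like this, with the table name as a placeholder:
EXEC sp_generate_inserts 'existingtable';  -- prints INSERT statements for every row in the table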
Create the database using Script Database as... > CREATE To.
Within SSMS on the source server, use the export wizard with the destination server database as the destination.
Source instance > YourDatabase > Tasks > Export data
Data Source = SQL Server Native Client
Validate/enter server & database
Destination = SQL Server Native Client
Validate/enter server & database
Follow the wizard through
