Data Migration from On-Prem to Azure SQL (PaaS) - sql-server

We have an on-prem SQL Server database (SQL Server 2017, compatibility level 140) that is about 1.2 TB. We need to do a repeatable migration of just the data to Azure SQL Database (PaaS). The on-prem database has procedures and functions that do cross-database queries, which rules out the Data Migration Assistant. Many of the tables we need to migrate are system-versioned temporal tables (just to make this more fun). Ideally we would like to move the data into a different schema of a different database so we can avoid using external tables (we are worried about their performance).
Moving the data is just the first step, as we also need to run an ETL job on the data to massage it into the new table structure.
We are looking at using Azure Data Factory (ADF), but it has trouble with system-versioned tables unless we turn versioning off first.
What other options can we look at and try so we can do this quickly and repeatably? Do we need to change to IaaS or use a third-party tool? Did we miss options in ADF that would handle this?
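For reference, the workaround we know of for ADF is toggling versioning off around the copy and back on afterwards, roughly like this (table and history-table names are placeholders):
ALTER TABLE dbo.MyTable SET (SYSTEM_VERSIONING = OFF);  -- history table becomes a plain table
-- ... bulk copy / load the data here ...
ALTER TABLE dbo.MyTable SET (SYSTEM_VERSIONING = ON
    (HISTORY_TABLE = dbo.MyTableHistory, DATA_CONSISTENCY_CHECK = ON));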

If I summarize your requirements, you are not just migrating a database to the cloud but the complete architecture of your SQL Server, which includes:
1.2 TB of data
Continuous, repeatable data migration afterwards
Procedures and functions for cross-database queries
System-versioned tables
Points 1, 3, and 4 can be handled by creating a .bacpac file in SQL Server Management Studio (SSMS), exporting it from on-premises to Azure Blob Storage, and then importing that file into Azure SQL Database. The .bacpac file created in SSMS can include all of the system-versioned tables, which can then be imported into the destination database.
Follow this third-party tutorial by SQLShack to migrate data to Azure SQL Database.
The stored procedures can also be moved using SQL scripts. Follow the steps below:
Go to the server in Management Studio.
Select the database, right-click on it, and go to Tasks.
Select the Generate Scripts option under Tasks.
Once the wizard starts, select the desired stored procedures you want to copy and create a file from them.
Then run the script from that file against the Azure SQL database, which you can log in to from SSMS.
The repeatable migration of data is the challenging part. You can try Change Data Capture (CDC), although I'm not sure it is exactly what your requirement calls for. You can enable CDC at the database level using the command below:
USE <databasename>;
EXEC sys.sp_cdc_enable_db;
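To actually capture changes, CDC also has to be enabled per table; a minimal sketch (schema and table names are placeholders):
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'MyTable',
    @role_name     = NULL;  -- NULL: no gating role required to query the change data
-- Changes can then be read with the generated function, e.g.
-- cdc.fn_cdc_get_all_changes_dbo_MyTable(@from_lsn, @to_lsn, N'all')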
Refer to this overview to learn more: https://www.qlik.com/us/change-data-capture/cdc-change-data-capture

Related

How to migrate the schema of a SQL Server database from on-premises to SQL Server on AWS RDS with non-supported features

I'm working on taking an on-premises server that runs SQL Server 2019 and migrating it to the cloud. The data is not the important thing right now, but rather the schema, since this is a proof of concept. The main issue is that the on-premises server sometimes uses FILESTREAM to handle files. This will have to change in the future as refactoring and application updates take place.
The easiest way, I thought, would be to generate a schema .sql script from the old DB and run that in the new environment, but this generated a TON of errors (25k).
Most of the errors include:
Failed permissions in database 'master'
Not finding certain objects in the new clean DB
Extended properties are not permitted on an object or it doesn't exist
Invalid data types
Database doesn't exist or permission not allowed
Filestream feature is disabled
So this probably won't work as a drop-in solution to get the schema migrated to the new DB. I've heard about AWS DMS (Database Migration Service), but I don't know a lot about it. So I'm asking: what tools could I look into to migrate over to RDS when RDS doesn't support features native to SQL Server?
One way to import the schema is through the Generate Scripts wizard. You will have to manually tweak some things to make FILESTREAM and the local configuration of the SQL Server work nicely with AWS RDS.
Generate and Publish Scripts Guide
Go to the source database.
Right-click the database in the Object Explorer on the left and choose Tasks > Generate Scripts.
Select all tables, procedures, etc., except for the FILESTREAM tables.
In the wizard pop-up, under Set Scripting Options, choose to make a .sql file; under Advanced options, choose Schema Only. This will generate a script with only the metadata for the tables and not the data in them.
Generate the file.
Copy the .sql file over to the EC2 instance (probably the bastion host) that is connected to the RDS instance.
Open SQL Server Management Studio, right-click the top-most object in the Object Explorer, and open a new query.
Copy and paste the code inside the .sql file into the query window.
Change the file path locations of the data and log files to D:\rdsdbdata\DATA\TEST_AWS.mdf and D:\rdsdbdata\DATA\TEST_AWS_Log.ldf respectively. Any other file location will not be recognized by RDS, and the database will fail to be created.
Comment out or remove the lines of code that include:
a. ALTER DATABASE [TEST_AWS] SET TRUSTWORTHY OFF
b. ALTER DATABASE [TEST_AWS] SET HONOR_BROKER_PRIORITY
c. ALTER DATABASE [TEST_AWS] SET DB_CHAINING OFF
d. Statements creating global users
e. FILESTREAM options
Execute the script.
Consider adding DROP DATABASE [TEST_AWS] toward the top of the script, before the creation of the new database, just in case you need to run the script multiple times to find the errors. This will save you from overwrite errors or leftover half-created objects.
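A minimal sketch of that guard, following the example database name used above:
-- Drop the database first if it already exists, so the script can be rerun
IF DB_ID(N'TEST_AWS') IS NOT NULL
BEGIN
    ALTER DATABASE [TEST_AWS] SET SINGLE_USER WITH ROLLBACK IMMEDIATE; -- close open connections (verify this is permitted on your RDS instance)
    DROP DATABASE [TEST_AWS];
END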

Oracle to SQL Server export

I have to move data from an existing Oracle database to which I don't have direct access. The data is about 11 tables, 5 GB each. The database admin can export the tables to .csv or XML. The problem with CSV is that some of the data is textual with lots of special characters. The problem with XML is that the markup is an overhead which will significantly increase the size of the files. The DBA is not competent enough to provide a working, neat solution; he uses Toad as his database tool. Can you provide some ideas on how to perform such a migration in the best possible way?
Please refer to the steps below to migrate the data from Oracle to SQL Server.
Recommended Migration Process
To successfully migrate objects and data from Oracle databases to SQL Server, Azure SQL Database, or Azure SQL Data Warehouse, use the following process:
1. Create a new SSMA project.
2. After you create the project, you can set project conversion, migration, and type mapping options. For information about project settings, see Setting Project Options (OracleToSQL). For information about how to customize data type mappings, see Mapping Oracle and SQL Server Data Types (OracleToSQL).
3. Connect to the Oracle database server.
4. Connect to an instance of SQL Server.
5. Map Oracle database schemas to SQL Server database schemas.
6. Optionally, create assessment reports to assess database objects for conversion and estimate the conversion time.
7. Convert Oracle database schemas into SQL Server schemas.
8. Load the converted database objects into SQL Server. You can do this in one of the following ways:
* Save a script and run it in SQL Server.
* Synchronize the database objects.
9. Migrate data to SQL Server.
10. If necessary, update database applications.
For more details:
https://learn.microsoft.com/en-us/sql/ssma/oracle/migrating-oracle-databases-to-sql-server-oracletosql?view=sql-server-2017
After the admin exports the data into CSV, try to convert it into a character set that will recognize all the special characters.
Then try to follow the steps from this link: link. It might work.
If there are still special characters after the import, try to convert them manually.
Get the DBA to export the tables using the ASCII delimiters that were designed for this purpose:
Row delimiter: decimal 30 / 0x1E (Record Separator)
Column delimiter: decimal 31 / 0x1F (Unit Separator)
Then you can use BCP (or any other similar product) to upload the data to SQL Server.
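On the SQL Server side, a BULK INSERT into a staging table can consume those delimiters directly. A hedged sketch (the file path and staging table are hypothetical, and the hex terminator syntax assumes a reasonably recent SQL Server version):
BULK INSERT dbo.StagingTable
FROM 'C:\import\oracle_export.dat'  -- hypothetical export file
WITH (
    FIELDTERMINATOR = '0x1F',  -- ASCII Unit Separator
    ROWTERMINATOR   = '0x1E',  -- ASCII Record Separator
    CODEPAGE        = '65001'  -- UTF-8, to preserve the special characters
);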

SSIS Data Transfer from Azure SQL to On-premises SQL Server 2016

Is there any way to transfer a large volume of data from Azure SQL to an on-premises SQL Server 2016 Enterprise/Standard instance? The requirements are as follows:
Weekly full database transfer
Daily delta transfer before midnight
I read about SSIS for Azure Blob Storage, but I am not sure whether it is applicable in this context.
Update: I found an article on Azure SQL Data Sync; according to that article, it seems doable. Please share your experiences; that would be extremely helpful.
https://www.mssqltips.com/sqlservertip/3062/understanding-sql-data-sync-for-sql-server/
Weekly full database transfer
SSIS doesn't provide a way to do a full transfer of data (i.e., a backup), unless you want to truncate and insert from the source.
For the weekly full database transfer, I would go with the SQL Azure export/import functionality.
Refer to the links below for more details:
1. https://github.com/richorama/SQLDatabaseBackup
2. I need to automate SQL Azure database backup in SQL Script files. How can I do so?
Daily delta transfer before midnight
You will need a way to identify the delta, so create a table holding all table names and their last run times (as sketched below).
Then create a console application that uses bulk-insert functionality, takes the above table as its base, and inserts into the on-premises database.
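A minimal sketch of that watermark table and delta query (table and column names are hypothetical; it assumes the source tables carry a modified-date column you can filter on):
CREATE TABLE dbo.ReplicationWatermark (
    TableName   sysname   NOT NULL PRIMARY KEY,
    LastRunTime datetime2 NOT NULL
);

-- Pull only the rows changed since the last run for one table
DECLARE @LastRun datetime2 =
    (SELECT LastRunTime FROM dbo.ReplicationWatermark WHERE TableName = N'dbo.Orders');

SELECT *
FROM dbo.Orders
WHERE ModifiedDate > @LastRun;  -- hypothetical change-tracking column

-- After a successful copy, advance the watermark
UPDATE dbo.ReplicationWatermark
SET LastRunTime = SYSUTCDATETIME()
WHERE TableName = N'dbo.Orders';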

Compare millions of records from Oracle to SQL Server

I have an Oracle database and a SQL Server database. In each there is one table, say Inventory, which contains millions of rows and keeps growing.
I want to compare the Oracle table data with the SQL Server data to find out which records are missing in the SQL Server table, on a daily basis.
Which is the best approach for this?
Create an SSIS package.
Create a Windows service.
I want to consume as little time and as few resources as possible to achieve this functionality.
E.g., 18 million records in Oracle and 16-17 million in SQL Server.
This situation of two different databases arises because there are two different applications, one online and one offline.
EDIT: How about connecting to SQL Server from Oracle through Oracle Database Gateway for SQL Server to:
1) Run a direct query from Oracle against SQL Server to insert the missing records into SQL Server the first time.
2) Create a trigger on Oracle which is executed when a record is deleted from Oracle and which inserts the deleted record into a new Oracle table (sketched below).
3) Create an SSIS package to map the newly created Oracle table to SQL Server and update the SQL Server records. This way only a few records have to be processed daily through SSIS.
What do you think of this approach?
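For step 2, the trigger I have in mind would look roughly like this (table and column names are hypothetical):
CREATE OR REPLACE TRIGGER trg_inventory_delete
AFTER DELETE ON inventory
FOR EACH ROW
BEGIN
  -- log the deleted row so the daily SSIS run only has to process the delta
  INSERT INTO inventory_deleted (item_id, deleted_at)
  VALUES (:OLD.item_id, SYSDATE);
END;
/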
I would create an SSIS package and load the data from the Oracle table using a Data Flow with an OLE DB Source. If you have SQL Server Enterprise, the Attunity connectors are a bit faster.
Then I would load the keys from the SQL Server table into a Lookup transformation, match the two sources on the key, and direct unmatched rows into a separate output.
Finally, I would direct the unmatched-rows output to an OLE DB Command to update the SQL Server table.
This SSIS package will require a lot of memory, but as the matching is done in memory with minimal I/O, it will probably outperform other solutions for speed. It will need enough free memory to cache all the keys from the SQL Server table.
SSIS also has the advantage that it has lots of other transformation functions available if you need them later.
What you basically want to do is replication from Oracle to SQL Server.
You could do this in SSIS, a Windows service, or indeed a multitude of platforms.
The real trick is using the correct design pattern.
There are two general design patterns:
Snapshot replication
You take all records from both systems and compare them somewhere (so far we have suggestions to compare in SSIS or to compare on Oracle, but not yet a suggestion to compare on SQL Server, although that is also valid).
You are comparing 18 million records here, so this is a lot of work.
Differential replication
You record the changes in the publisher (i.e., Oracle) since the last replication, then you apply those changes to the subscriber (i.e., SQL Server).
You can do this manually by implementing triggers and log tables on the Oracle side, then using a regular ETL process (SSIS, command-line tools, text files, whatever), probably scheduled in SQL Agent, to apply these changes to SQL Server.
Or you could use the out-of-the-box replication capability to set up Oracle as a publisher and SQL Server as a subscriber: https://msdn.microsoft.com/en-us/library/ms151149(v=sql.105).aspx
You're going to have to try a few of these and see what works for you.
Given this objective:
I want to consume as little time and as few resources as possible to achieve this functionality
transactional replication is far more efficient, but more complicated. For maintenance purposes, which platforms (.NET, SSIS, Python, etc.) are you most comfortable with?
Other alternatives:
If you can use Oracle Database Gateway for SQL Server, then you do not need to transfer data at all and can run the query directly.
If you can't use the gateway, you can use Pentaho Data Integration or another ETL tool to compare the tables and get the results. It is easy to use.
I think the best approach is using the Oracle gateway. Just follow the steps below; I have had experience with a similar setup.
Install and Configure Oracle Database Gateway for SQL Server.
https://docs.oracle.com/cd/B28359_01/gateways.111/b31042/installsql.htm
Now you can create a database link from Oracle to SQL Server.
Create a procedure which finds the records missing from the SQL Server database and inserts them.
For example, you can use a statement like this inside your procedure:
INSERT INTO "dbo"."sql_server_table"@dblink_name ("column1", "column2", ..., "column5")
SELECT column1, column2, ..., column5 FROM oracle_table
MINUS
SELECT "column1", "column2", ..., "column5" FROM "dbo"."sql_server_table"@dblink_name;
Create a scheduler job which executes the procedure daily (a sketch follows below).
When both databases are online, the missing records will be inserted into SQL Server. Otherwise the scheduler job fails, and you can execute the procedure manually.
It takes minimal resources.
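For the scheduling step, an Oracle DBMS_SCHEDULER job is one option; a hedged sketch (the procedure name is hypothetical):
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'SYNC_MISSING_RECORDS_JOB',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'sync_missing_records',  -- the comparison procedure created above
    repeat_interval => 'FREQ=DAILY; BYHOUR=1',  -- run once a day at 01:00
    enabled         => TRUE);
END;
/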
I would suggest a homemade ETL solution:
Schedule an Oracle job to export the source table data (on a daily basis, depending on the application logic) to plain CSV format.
Schedule a SQL Server job (with an acceptable delay after the first Oracle job) to read this CSV file and import it into a staging table inside SQL Server using BULK INSERT.
The last part of the SQL Server job reads the staging table data and performs the logic (insert or update the target table), as sketched below. I suggest having another table to store reports of this daily job's results.
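A hedged sketch of that last step, assuming a staging table dbo.InventoryStaging and a target dbo.Inventory keyed on ItemId (all names hypothetical):
MERGE dbo.Inventory AS target
USING dbo.InventoryStaging AS source
    ON target.ItemId = source.ItemId
WHEN MATCHED THEN
    UPDATE SET target.Quantity = source.Quantity      -- update existing rows
WHEN NOT MATCHED BY TARGET THEN
    INSERT (ItemId, Quantity)
    VALUES (source.ItemId, source.Quantity);          -- insert missing rows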

How can I copy data records between two instances of a SQL Server database

I need to copy some records from our SQL Server 2005 test server to our live server. It's a flat lookup table, so there are no foreign keys or other referential integrity to worry about.
I could key in the records again on the live server, but this is tiresome. I could export the test server's records and table data in their entirety into a SQL script and run that, but I don't want to overwrite the records present on the live system, only add to them.
How can I select just the records I want and get them transferred into the live server? We don't have SharePoint, which I understand would allow me to copy them directly between the two instances.
If your production SQL Server and test SQL Server can talk to each other, you can just do it with a SQL INSERT statement.
First, run the following on your test server:
EXEC sp_addlinkedserver 'PRODUCTION_SERVER_NAME';
Then just create the INSERT statement:
INSERT INTO [PRODUCTION_SERVER_NAME].DATABASE_NAME.dbo.TABLE_NAME (Names_of_Columns_to_be_inserted)
SELECT Names_of_Columns_to_be_inserted
FROM TABLE_NAME
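Since you only want to add records that are not already on the live server, a hedged variant of the same INSERT (assuming a key column, here called ID, that identifies duplicates):
INSERT INTO [PRODUCTION_SERVER_NAME].DATABASE_NAME.dbo.TABLE_NAME (Names_of_Columns_to_be_inserted)
SELECT s.Names_of_Columns_to_be_inserted
FROM TABLE_NAME AS s
WHERE NOT EXISTS (
    SELECT 1
    FROM [PRODUCTION_SERVER_NAME].DATABASE_NAME.dbo.TABLE_NAME AS t
    WHERE t.ID = s.ID  -- hypothetical key column
);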
I use SQL Server Management Studio and do an export by right-clicking the database and going to Tasks > Export Data. I think it works across servers as well as databases, but I'm not sure.
An SSIS package would be best suited to do the transfer; it would take literally seconds to set up!
I would just script to SQL and run it on the other server for quick-and-dirty transferring. If this is something that you will be doing often and you need to set up a mechanism, SQL Server Integration Services (SSIS), the successor to the older Data Transformation Services (DTS), is designed for this sort of thing. You develop the solution in a mini Visual Studio environment and can build very complex solutions for moving and transforming data.
