AWS SQL Server Read Replicas - publish from private server

Is it possible to use AWS RDS for SQL Server as the destination / target for a read replica?
I have a database that runs on a private MS SQL instance in my data centre and I would like to publish a set of tables to an AWS SQL Server instance so that I can use that instance, and possibly others, to speed up read-only queries.
I know that AWS has restrictions, and I wanted to know whether the necessary infrastructure exists to allow me to run a publisher and distributor within my data centre and target an AWS SQL Server database.

You can achieve this with Cloudbasic's RDS SQL Server HADR tool available on the AWS Marketplace: https://aws.amazon.com/marketplace/pp/B00OU0PE5M
Launch it in the same AWS VPC as your RDS SQL Server instance. In the quick setup, make sure to select SQL Server-to-SQL Server replication (the tool can also stream data from SQL Server to Redshift and to S3 data lakes). Then, to limit publishing to a subset of tables, go to the advanced settings after testing the connections and exclude the tables you do not want published.
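For reference, the native setup the question describes (an on-premises publisher and distributor pushing a transactional publication to the AWS-hosted instance) would look roughly like the sketch below on the publisher side. This is only an illustration; whether the AWS-side instance will accept the push subscription depends on the restrictions the question mentions, and the database, publication, table, endpoint, and login names here are all hypothetical.

    -- Publisher side (on-premises); assumes the distributor is already configured.
    EXEC sp_replicationdboption
        @dbname = N'SalesDb', @optname = N'publish', @value = N'true';

    USE SalesDb;

    -- Create a continuous transactional publication that allows push subscriptions.
    EXEC sp_addpublication
        @publication = N'SalesReadOnly',
        @status = N'active',
        @allow_push = N'true',
        @repl_freq = N'continuous';

    EXEC sp_addpublication_snapshot @publication = N'SalesReadOnly';

    -- Add only the tables you want published.
    EXEC sp_addarticle
        @publication = N'SalesReadOnly',
        @article = N'Orders',
        @source_owner = N'dbo',
        @source_object = N'Orders';

    -- Push subscription targeting the AWS-hosted SQL Server (hypothetical endpoint).
    EXEC sp_addsubscription
        @publication = N'SalesReadOnly',
        @subscriber = N'myinstance.xxxxxxxx.us-east-1.rds.amazonaws.com',
        @destination_db = N'SalesReadOnly',
        @subscription_type = N'push';

    EXEC sp_addpushsubscription_agent
        @publication = N'SalesReadOnly',
        @subscriber = N'myinstance.xxxxxxxx.us-east-1.rds.amazonaws.com',
        @subscriber_db = N'SalesReadOnly',
        @subscriber_security_mode = 0,
        @subscriber_login = N'repl_user',
        @subscriber_password = N'<password>';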

Related

Scheduled SQL Server Instance Push to Azure SQL Database

I am somewhat surprised that pushing data on a nightly schedule from a SQL Server installed instance (on a Windows VM in Azure) to a SQL Azure database is still not straightforward. I see some articles and direction on how to 'migrate' schemas and data, but what about a nightly job that pushes from my SQL Server instance to individual client SQL Azure data stores?
Should I start with SSIS? Azure Data Factory? Python libraries? Why isn't a connection between the two 'native'?
Again, all the links and references so far have been for one-time migration. I want the two in a data ecosystem with a reliable flow.
John
We do this using SSIS running from the on-prem side, because we already have a bunch of SSIS projects hosted on-prem and have yet to migrate anything into Azure Data Factory. We are using SQL authentication to make the connection to the SQL Azure database.
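As a small illustration of the SQL-authentication part, the login and database user that the SSIS connection manager would use can be created on the Azure SQL side roughly like this (the login name is hypothetical):

    -- Run in the logical server's master database.
    CREATE LOGIN ssis_loader WITH PASSWORD = 'UseAStrongPasswordHere!1';

    -- Run in the target SQL Azure database.
    CREATE USER ssis_loader FOR LOGIN ssis_loader;
    ALTER ROLE db_datareader ADD MEMBER ssis_loader;
    ALTER ROLE db_datawriter ADD MEMBER ssis_loader;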

Migrating to Azure Sql Database with external dependencies in another database

The question: is it possible to point a view in database A, running on the Azure SQL Database service, to tables/views on a SQL Server running in a VM? I've tried external tables but have come up short.
Scenario:
Two applications, from two different vendors, that exchange data.
We've got three databases in total: one for each application, plus an integration database with views that both applications use either directly or through views.
We want to migrate to Azure and would prefer to use the Azure SQL Database service as much as possible, but one of the applications is not ready for it, and therefore its database has to be hosted either on a managed instance or in a VM.
The issue is that there are views in database A (running on the Azure SQL Database service) that point to views and tables in the integration database, which in turn references tables in database B (running on SQL Server in the VM).
The short answer is "no". You can use external tables to query other SQL Azure databases, but there is no exact analogue for linked servers in Azure SQL Database. You can use SQL Azure Managed Instance (which supports SQL-SQL linked servers but not arbitrary linked servers).
There is a workaround, however. You can run SQL Server in an Azure VM and have it point to SQL Azure as a target, as well as to the other sources you want to connect. Then you can push data to/from Azure SQL DB using the SQL Server in the VM. You don't have the same management overhead in this approach, since you don't really need to host data in that SQL Server if you don't want to. Note that this will be slower than direct connections to SQL Azure, but you can do it for a period of time if it would help you during a migration.
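For completeness, the external-table (elastic query) approach mentioned above only works when the remote database is itself an Azure SQL database, which is why it doesn't cover the VM case. A minimal sketch, with hypothetical server, database, credential, and table names:

    -- Run in database A (Azure SQL Database).
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

    CREATE DATABASE SCOPED CREDENTIAL IntegrationCred
        WITH IDENTITY = 'integration_reader', SECRET = '<password>';

    CREATE EXTERNAL DATA SOURCE IntegrationDb WITH (
        TYPE = RDBMS,
        LOCATION = 'myserver.database.windows.net',
        DATABASE_NAME = 'Integration',
        CREDENTIAL = IntegrationCred
    );

    -- Local definition matching the remote table; views in database A can then
    -- reference dbo.CustomerExport as if it were local.
    CREATE EXTERNAL TABLE dbo.CustomerExport (
        CustomerId INT,
        Name NVARCHAR(100)
    ) WITH (DATA_SOURCE = IntegrationDb);

    SELECT TOP (10) * FROM dbo.CustomerExport;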

Is it possible to do a SQL Server database to other-cloud MySQL migration with AWS DMS?

AWS DMS has a schema conversion tool and other advanced database migration features. So is it possible to migrate a SQL Server database running inside a VM to another cloud's MySQL (required for business reasons) using AWS DMS, or must the target database be inside AWS RDS? The source data size is close to 60 GB and there are close to 300 tables. Please advise.
The target database should be inside AWS RDS.

Microsoft Azure SQL Database Trace Replay

I'd like to make a direct comparison of a physical server to a serverless Azure SQL environment by using an hour snapshot of all SQL Server activity from some existing infrastructure.
Is it possible to use the SQL Server Profiler to record trace data and replay this on an Azure SQL instance? I am not attempting to tweak or performance tune the existing system, but want to compare how an Azure server at various tiers will perform that workload.
Another option might be to use the SQL Server Distributed Replay functionality if this is an option in Azure.
SQL Server Profiler and Distributed Replay are not available in Azure SQL Database.
https://feedback.azure.com/forums/217321-sql-database/suggestions/431943-profiler-for-sql-azure
You may want to try SQL Workload Profiler.
https://cbailiss.wordpress.com/sql-workload-profiler/
SQL Azure Managed Instances will soon have SQL Profiler available:
https://www.youtube.com/watch?v=0uT46lpjeQE&feature=youtu.be&t=1415
Your entire question boils down to: how can I choose the correct tier in SQL Azure for my on-premises instances?
As Alberto points out, there is no way to handle this directly right now; for now, you can use the workaround below.
Go to http://dtucalculator.azurewebsites.net/, download the PowerShell script, schedule it to run for a period, and then upload the generated log files to the same website. Once done, you will be presented with a chart showing the recommended service tier.
References:
https://www.simple-talk.com/cloud/platform-as-a-service/azure-sql-database-how-to-choose-the-right-service-tier/

Azure Data Factory - Azure SQL Server Destination Schema

I need to copy several hundred tables from an on-premises SQL Server to an Azure SQL Server using ADF. I don't have access to the DB or the network it's on, but I was able to get the on-prem data gateway installed, was given an AD account with sufficient DB privileges, and then used "Copy Data (Preview)" to copy all tables to blob storage.
My problem is that I don't have access to the DB's schema, so I can't easily provision the Azure SQL Server with the necessary tables and columns; there are several hundred tables, and doing it manually would be extremely time-consuming. I found that copying to an Azure Data Warehouse has an "Auto table creation" feature, and I am able to copy from the on-prem SQL Server directly to Azure DW without defining a schema at the destination, but this isn't supported for Azure SQL Server.
Is there a way to obtain the same script/method that provisions the Azure DW schema and use it for Azure SQL Server? Is there any other way to obtain the source DB's schema via the on-prem data gateway?
Given that you were able to run the Copy Data tool to extract data out of the on-premises SQL Server, you must have credentials to access the database. Can you run SSMS (SQL Server Management Studio) from the on-prem data gateway machine and examine/extract the schema?
"Auto table creation" feature is currently only available for Azure Data Warehouse. Supporting this feature when loading into Azure SQL Server is on our backlog but we don't have a committed timeline for this yet.
Shirley Wang
Couldn't you use the DMG (data management gateway) to run a query against the database and generate the schema for your tables, assuming your AD account has read access to the metadata?
So with your pipeline, instead of having it select * from each table, have it run a query to extract the schema; some examples here: How can I show the table structure in SQL Server query?
You would then output that to a blob.
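A minimal sketch of such a schema query, using INFORMATION_SCHEMA (nothing here is gateway-specific; the output is one row per column, which is enough to script matching tables on the Azure side):

    -- One row per column of every base table in the source database.
    SELECT
        c.TABLE_SCHEMA,
        c.TABLE_NAME,
        c.COLUMN_NAME,
        c.DATA_TYPE,
        c.CHARACTER_MAXIMUM_LENGTH,
        c.NUMERIC_PRECISION,
        c.NUMERIC_SCALE,
        c.IS_NULLABLE
    FROM INFORMATION_SCHEMA.COLUMNS AS c
    JOIN INFORMATION_SCHEMA.TABLES AS t
        ON t.TABLE_SCHEMA = c.TABLE_SCHEMA
       AND t.TABLE_NAME = c.TABLE_NAME
    WHERE t.TABLE_TYPE = 'BASE TABLE'
    ORDER BY c.TABLE_SCHEMA, c.TABLE_NAME, c.ORDINAL_POSITION;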
