DevOps or SQL Data Factory pipeline to Manipulate SQL Backup - sql-server

I currently have a PowerShell solution that the client wants moved into Azure DevOps. I've outlined the steps below and want to know whether this can be achieved in either Azure DevOps or Azure Data Factory.
1) Download the backup from Azure Blob Storage
2) Within the pipeline/package, unpack and mount the file
3) Truncate specific tables
4) Back up to a different Blob container
or
1) Download the backup from Azure Blob Storage to a SQL Server
2) Restore it to a SQL instance within the subscription
3) Truncate specific tables
4) Back up to a different Blob container (a T-SQL sketch of these steps follows below)
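For what it's worth, the second option maps almost one-to-one onto plain T-SQL that a pipeline agent can run through sqlcmd or Invoke-Sqlcmd. A minimal sketch, assuming a storage-account credential already exists (see the CREATE CREDENTIAL example further down) and using hypothetical database, table, and container names:

-- 1) + 2) Restore the backup straight from Blob Storage onto a SQL instance.
-- Logical file names are assumptions; check them with RESTORE FILELISTONLY.
RESTORE DATABASE StagingCopy
FROM URL = N'https://your-storage-account.blob.core.windows.net/source-container/full.bak'
WITH CREDENTIAL = 'sql_backup_credential',
     MOVE 'DataFile' TO N'D:\Data\StagingCopy.mdf',
     MOVE 'LogFile'  TO N'D:\Log\StagingCopy.ldf',
     REPLACE;

-- 3) Truncate the tables that must not travel further
--    (tables referenced by foreign keys need DELETE instead).
TRUNCATE TABLE StagingCopy.dbo.AuditLog;
TRUNCATE TABLE StagingCopy.dbo.CustomerPII;

-- 4) Back the trimmed database up to a different container.
BACKUP DATABASE StagingCopy
TO URL = N'https://your-storage-account.blob.core.windows.net/target-container/trimmed.bak'
WITH CREDENTIAL = 'sql_backup_credential', FORMAT;

Because the restore happens directly from a URL, there is nothing to download, unpack, or mount; an Azure DevOps pipeline would only need a single script task pointed at the SQL instance.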

Related

Migrating SSIS and SQL Server data from AWS to Azure

I have an Amazon EC2 instance on AWS with SQL Server installed. I want to migrate the SQL Server data and SSIS packages to Azure VMs.
I cannot afford any loss of information.
What would be the best way to do so?
I would recommend using Azure Blob Storage as the transport.
SQL Server can, out of the box, create backups to and restore databases from Azure Blob containers directly.
For instance:
CREATE CREDENTIAL sql_backup_credential
-- storage account name:
WITH IDENTITY = 'your-storage-account',
-- storage account key from the portal:
SECRET = 'pvv99UFQvuLadBEb7ClZhRsf9zE8/OA9B9E2ZV2kuoDXu7hy0YA5OTgr89tEAqZygH+3ckJQzk8a4+mpmjN7Lg==';
GO

BACKUP DATABASE Test
TO URL = N'https://your-storage-account.blob.core.windows.net/your-container/test.bak'
WITH CREDENTIAL = 'sql_backup_credential';
So, this way all user databases plus SSISDB can be transferred from one isolated box to another.
Of course, the firewall settings of both SQL Server VMs must allow outbound and inbound HTTPS connections.
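On the destination VM the operation is symmetric: create the same credential there, then restore from the container. A sketch with the same placeholder names (the logical file names are assumptions; check them with RESTORE FILELISTONLY):

CREATE CREDENTIAL sql_backup_credential
WITH IDENTITY = 'your-storage-account',
SECRET = '<storage account key>';
GO

-- MOVE relocates the files to paths that exist on the destination VM.
RESTORE DATABASE Test
FROM URL = N'https://your-storage-account.blob.core.windows.net/your-container/test.bak'
WITH CREDENTIAL = 'sql_backup_credential',
     MOVE 'Test'     TO N'D:\Data\Test.mdf',
     MOVE 'Test_log' TO N'D:\Log\Test_log.ldf';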

How to deploy SQL schema changes to Always On Availability group via Azure pipeline?

We have the following setup: two SQL Servers (01 & 02) configured as Always On, with a listener node/alias (LT) pointing to whichever server is currently primary (01/02).
What would be the best way to configure an Azure DevOps pipeline to deploy changes to the SQL Servers? Along with deploying the changes, the release pipeline also needs to include steps for restoring a new DB backup from another server.
My thought is to configure everything against the listener (LT), including running the PowerShell script that creates the AzDo client. But what will happen once the servers (primary & secondary) are switched?
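For context: a connection made through the listener always lands on the current primary, so a failover changes which physical server the pipeline talks to, not the connection string. A deployment step can verify this at run time; a minimal T-SQL sketch:

-- Run through the listener (LT): shows which replica the connection
-- landed on and whether it currently holds the PRIMARY role.
SELECT ar.replica_server_name,
       ars.role_desc           -- PRIMARY or SECONDARY
FROM sys.dm_hadr_availability_replica_states AS ars
JOIN sys.availability_replicas AS ar
     ON ars.replica_id = ar.replica_id
WHERE ars.is_local = 1;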

Azure SQL Database and SSIS packages development

I must decide whether to buy a SQL Server Standard license or subscribe to Azure SQL Database for the needs of a small company. Basically, what I need is the ability to develop SSIS packages for data import from Excel and schedule their execution, plus develop job(s) for sending automated e-mails to customers. As I have zero administration skills, I think Azure services would be the better option, but on the other hand I cannot find good information on how to develop SSIS directly in the Azure environment. Would I still need SQL Server for that?
To import data from Excel to Azure SQL Database with SSIS, you can refer to this tutorial: Import data from Excel or export data to Excel with SQL Server Integration Services (SSIS)
It describes the connection information you have to provide, and the settings you have to configure, to import data from Excel or export data to Excel with SSIS.
You also need to download SQL Server Data Tools (SSDT) to help you create the SSIS package. Reference tutorial: Create Packages in SQL Server Data Tools.
All of this requires a SQL Server environment; you cannot develop the actual SSIS job in Azure without SQL Server.
You don't need SSIS to import data from Excel files into Azure SQL Database. You just need a scheduled upload of those Excel documents to an Azure Storage Account, and from there you can use OPENROWSET or BULK INSERT to import them into Azure SQL Database.
First, create a DATABASE SCOPED CREDENTIAL holding a Shared Access Signature (SAS) token for the Storage Account.
CREATE DATABASE SCOPED CREDENTIAL UploadInvoices
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=2018-03-28&ss=b&srt=sco&sp=rwdlac&se=2019-08-31T02:25:19Z&st=2019-07-30T18:25:19Z&spr=https&sig=KS51p%2BVnfUtLjMZtUTW1siyuyd2nlx294tL0mnmFsOk%3D';
Now create an external data source that maps the Storage Account.
CREATE EXTERNAL DATA SOURCE MyAzureInvoices
WITH (
TYPE = BLOB_STORAGE,
LOCATION = 'https://newinvoices.blob.core.windows.net',
CREDENTIAL = UploadInvoices
);
Import the documents using OPENROWSET (the example file is a CSV export):
SELECT * FROM OPENROWSET(
    BULK 'week3/inv-2017-01-19.csv',
    DATA_SOURCE = 'MyAzureInvoices',
    FORMAT = 'CSV',
    FORMATFILE = 'invoices.fmt',
    FORMATFILE_DATA_SOURCE = 'MyAzureInvoices'
) AS DataFile;
Using BULK INSERT, specify the file path within the container and the data source:
BULK INSERT Colors2
FROM 'week3/inv-2017-01-19.csv'
WITH (DATA_SOURCE = 'MyAzureInvoices',
FORMAT = 'CSV');
You can automate this by using Azure Automation to schedule execution of a stored procedure that runs the OPENROWSET or BULK INSERT import of the uploaded files.
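A minimal sketch of such a procedure, reusing the Colors2 table and MyAzureInvoices data source from above (the procedure name and parameter are hypothetical):

-- Wrapper procedure for Azure Automation to call on a schedule.
CREATE OR ALTER PROCEDURE dbo.ImportInvoiceFile
    @FilePath NVARCHAR(256)  -- e.g. 'week3/inv-2017-01-19.csv'
AS
BEGIN
    SET NOCOUNT ON;

    -- BULK INSERT does not accept a variable for the file path,
    -- so the statement is built dynamically.
    DECLARE @sql NVARCHAR(MAX) = N'
        BULK INSERT Colors2
        FROM ''' + REPLACE(@FilePath, '''', '''''') + N'''
        WITH (DATA_SOURCE = ''MyAzureInvoices'', FORMAT = ''CSV'');';

    EXEC sys.sp_executesql @sql;
END;

Azure Automation then only needs to schedule a runbook that executes, for example, EXEC dbo.ImportInvoiceFile 'week3/inv-2017-01-19.csv'.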

Azure Data Factory from VM SQL Server to File on FTP

This year we moved from hosted servers to Azure VMs; we run two production servers (SQL and IIS). A vital component of our business is the bulk transfer of data files. We take customers' data from our SQL Server, write it out to a file (XLS, CSV, XML, PDF, Word, etc.), and then either email these files to customers or, in most cases, push them to their FTP servers. We also have a few import procedures where we retrieve data files. All of this is currently done with SSIS packages.
We're examining a move to Azure Data Factory as a replacement for SSIS, so that we can possibly move to either Azure SQL Database (if we can work around its Service Broker limitations) or an Azure SQL Managed Instance.
I've done some preliminary work with ADF, but I saw a couple of posts about its lack of FTP support. Is it possible to create/deliver files to FTP and retrieve/consume files from FTP using ADF? Also, almost all of these jobs are automated, and we use SQL Agent to run the packages. What is the Azure equivalent for scheduling these jobs to run?
There is automation in ADF, but the scheduler (trigger) is per pipeline. Azure Automation is more powerful and can automate more than one pipeline (Azure Data Factory v2), if needed.
Automation with Azure Data Factory (ADF)
You can receive files from FTP into an Azure Data Factory pipeline: Copy data from FTP server by using Azure Data Factory. The idea is that you receive a file via FTP into a particular pipeline activity, and that activity pushes the data to an Azure data source. Note that the FTP connector is a copy source only, so sending files out would need the SFTP connector (which supports writes) or another mechanism such as a custom activity.
Azure SQL Database Managed Instance is the most on-premises-like database (PaaS) service, but SQL Server deployed on an Azure VM still has more functionality.
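On the scheduling question: an Azure SQL Managed Instance keeps SQL Agent, so existing SQL Agent jobs can carry over largely as-is. A hypothetical sketch of recreating one export job there (job, step, schedule, and procedure names are made up for illustration):

USE msdb;
EXEC dbo.sp_add_job @job_name = N'NightlyCustomerExport';
EXEC dbo.sp_add_jobstep
     @job_name  = N'NightlyCustomerExport',
     @step_name = N'Run export procedure',
     @subsystem = N'TSQL',
     @command   = N'EXEC dbo.ExportCustomerFiles;';
EXEC dbo.sp_add_schedule
     @schedule_name     = N'Daily2AM',
     @freq_type         = 4,      -- daily
     @freq_interval     = 1,
     @active_start_time = 020000; -- HHMMSS
EXEC dbo.sp_attach_schedule
     @job_name      = N'NightlyCustomerExport',
     @schedule_name = N'Daily2AM';
EXEC dbo.sp_add_jobserver @job_name = N'NightlyCustomerExport';

(Azure SQL Database itself has no Agent; there, Elastic Jobs or Azure Automation fill the same role.)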

Azure SQL Database Corruption - Tables Missing

We have a Windows service that provisions Azure SQL databases for our clients, using the Microsoft.WindowsAzure.Management.Sql API. I have a database named "Green" on a server located in East Asia, server version v12. Today we found that the database's contents are gone: the tables, stored procedures, and data are all missing.
Restoring a deleted Azure SQL database using the Azure Portal
To restore a database in the Azure Portal, do the following:
1) Open the Azure Portal.
2) On the left side of the screen select BROWSE > SQL servers.
3) Navigate to the server with the deleted database you want to restore and select the server.
4) Scroll down to the operations section of your server blade and select Deleted databases.
5) Select the deleted database you want to restore.
6) Specify a database name, and click OK.
