Snowflake integration with Azure DevOps

Do you have information on how to integrate Snowflake with Azure DevOps for CI/CD? I don't see much information on docs.snowflake.com. I am interested in a step-by-step process or guide for implementing Azure DevOps with Snowflake.

There are different ways to do this. You can install and execute SnowSQL, dbt, or schemachange on Azure DevOps (ADO) agents. See this guide for an example with schemachange: https://quickstarts.snowflake.com/guide/devops_dcm_schemachange_azure_devops/index.html?index=..%2F..index#0
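For illustration, here is a minimal sketch of a deployment script that an Azure DevOps pipeline step could run. It assumes schemachange is installed on the agent (pip install schemachange), that the role, warehouse, and database names are placeholders you would replace, and that the pipeline passes the password through the SNOWFLAKE_PASSWORD environment variable, which schemachange reads:

# Minimal sketch of a schemachange deployment as an Azure DevOps pipeline step.
# All object names are placeholders; SNOWFLAKE_PASSWORD must be set as a secret
# pipeline variable because schemachange reads the password from the environment.
import os
import subprocess

subprocess.run(
    [
        "schemachange",
        "-f", "migrations",                     # folder of versioned SQL scripts
        "-a", os.environ["SNOWFLAKE_ACCOUNT"],  # account identifier from a pipeline variable
        "-u", os.environ["SNOWFLAKE_USER"],
        "-r", "DEPLOY_ROLE",                    # hypothetical role
        "-w", "DEPLOY_WH",                      # hypothetical warehouse
        "-d", "TARGET_DB",                      # hypothetical target database
        "-c", "TARGET_DB.SCHEMACHANGE.CHANGE_HISTORY",
    ],
    check=True,                                 # fail the pipeline step on error
)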

Related

Is it possible to execute a Data Factory pipeline as a step in a SQL Agent job on an Azure Managed Instance?

I've currently got an ETL process that dynamically builds and executes SQL jobs based on job steps saved in my database. Included in these jobs are steps that call SSIS packages to move data from one server to another and/or call stored procedures on target servers for further processing. I'm looking at what it would take to migrate our process from SQL Server to an Azure Managed Instance. One of the specific things I'm looking at is the feasibility of replacing the steps that call the SSIS packages with steps that execute Azure Data Factory pipelines or other ADF actions that accomplish the same results. So far I have not run across any examples of this. Does anyone have experience with accessing Data Factory functionality from SQL Agent jobs?
You can run PowerShell scripts via SQL Agent, as described in the Microsoft docs:
https://learn.microsoft.com/en-us/sql/powershell/run-windows-powershell-steps-in-sql-server-agent?view=sql-server-ver16
And via PowerShell and the ADF REST APIs, you can trigger the ADF pipelines.
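As a rough sketch of that second step in Python rather than PowerShell (the subscription, resource group, factory, and pipeline names below are placeholders, and the requests and azure-identity packages are assumed to be available), the ADF REST API's createRun operation can be invoked like this:

# Hedged sketch: trigger an Azure Data Factory pipeline run via the ADF REST API.
import requests
from azure.identity import DefaultAzureCredential

subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
factory = "<data-factory-name>"
pipeline = "<pipeline-name>"

# Acquire an Azure AD token for the ARM endpoint.
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}/providers/Microsoft.DataFactory"
    f"/factories/{factory}/pipelines/{pipeline}/createRun?api-version=2018-06-01"
)
resp = requests.post(url, headers={"Authorization": f"Bearer {token}"}, json={})
resp.raise_for_status()
print("Started pipeline run:", resp.json()["runId"])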

Secure Snowflake Python connector

I am trying to connect to Snowflake from my laptop using the Python connector. The problem is that my company has strict rules on connections and transferring data.
I couldn't find anything in their documentation on securing data transfer.
Has anyone done this at a company and can help me with a solution that I can offer to our IT team?
Thanks
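One point that may help with such a proposal: the Snowflake Python connector always communicates with Snowflake over HTTPS with TLS, so data is encrypted in transit by default; stricter controls such as network policies or private connectivity are configured on the Snowflake account side rather than in the connector call. A minimal connection, with placeholder credentials, looks like this:

# Minimal sketch with placeholder credentials. The connector itself always
# talks to Snowflake over HTTPS/TLS, so traffic is encrypted in transit.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="<warehouse>",
    ocsp_fail_open=False,  # optional: fail closed if certificate revocation can't be checked
)
cur = conn.cursor()
cur.execute("SELECT CURRENT_VERSION()")
print(cur.fetchone())
conn.close()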

How to do database change management in Snowflake from Bitbucket? DDL versioning for DEV/QA/UAT/PROD environments in Snowflake from Bitbucket?

Hi Everyone,
I have a requirement for DDL versioning / database change management from Bitbucket to Snowflake. Is there any kind of documentation on how to achieve this task?
Thanks in Advance
You can use https://github.com/Snowflake-Labs/schemachange for database change management. It covers the instructions specific to Snowflake; you'll then need to separately set up the Bitbucket pipeline (e.g. https://support.atlassian.com/bitbucket-cloud/docs/python-with-bitbucket-pipelines/).
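As a hedged illustration of how one migration folder might be promoted through DEV/QA/UAT/PROD from a Bitbucket pipeline, the pipeline step could select the target database per environment. Every name below is a placeholder convention, TARGET_ENV would be set by the pipeline (e.g. per branch or deployment), and SNOWFLAKE_PASSWORD is assumed to be a secured pipeline variable:

# Hedged sketch: one set of versioned scripts, deployed per environment.
import os
import subprocess

env = os.environ.get("TARGET_ENV", "DEV")   # DEV / QA / UAT / PROD, set by the pipeline
database = f"ANALYTICS_{env}"               # hypothetical per-environment naming convention

subprocess.run(
    [
        "schemachange",
        "-f", "migrations",
        "-a", os.environ["SNOWFLAKE_ACCOUNT"],
        "-u", os.environ["SNOWFLAKE_USER"],
        "-r", f"DEPLOY_{env}",              # hypothetical per-environment role
        "-w", "DEPLOY_WH",
        "-d", database,
        "-c", f"{database}.SCHEMACHANGE.CHANGE_HISTORY",
        "--create-change-history-table",    # bootstrap the history table on first run
    ],
    check=True,
)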

Azure SQL DTU level increase and decrease on a PowerShell schedule

We have a REPORT database in Azure.
Every morning, we need some complex SQL procedures to run on that server.
Normally the DTU level of this server is 0, but in the mornings it needs to be raised to DTU level 3.
We currently do this manually, but we want it automated via PowerShell or anything similar.
How can we achieve this? Can you show us the way, please? We are .NET developers and don't know much about scripting languages.
Thanks for reading and hope you can help us.
There are PowerShell cmdlets, the Azure CLI, and a REST API that you can use to manage elastic pools.
PowerShell (see the Set-AzSqlElasticPool documentation for more information):
Set-AzSqlElasticPool -ResourceGroupName "ResourceGroup01" -ServerName "Server01" -ElasticPoolName "ElasticPool01" -Dtu 1000 -DatabaseDtuMax 100 -DatabaseDtuMin 20
Azure CLI (see the az sql elastic-pool update documentation for more information):
az sql elastic-pool update -g mygroup -s myserver -n mypool -c <capacity>
REST API (elastic pool update):
PATCH https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Sql/servers/{serverName}/elasticPools/{elasticPoolName}?api-version=2017-10-01-preview
A single Azure SQL database supports manual dynamic scaling, but not autoscale. For a more automatic experience, consider using elastic pools, which allow databases to share resources in a pool based on individual database needs. However, there are scripts that can help automate scaling for a single Azure SQL database; for an example, see Use PowerShell to monitor and scale a single SQL Database.
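For completeness, here is an unofficial sketch of that kind of automation in Python rather than PowerShell: it PATCHes a single database's SKU through the ARM REST API (all resource names are placeholders; the requests and azure-identity packages are assumed):

# Hedged sketch: scale a single Azure SQL database up before the morning jobs
# and back down afterwards by PATCHing its SKU via the ARM REST API.
import requests
from azure.identity import DefaultAzureCredential

def set_service_objective(sku_name: str) -> None:
    subscription_id = "<subscription-id>"   # placeholder
    resource_group = "<resource-group>"     # placeholder
    server = "<server-name>"                # placeholder
    database = "REPORT"
    token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token
    url = (
        f"https://management.azure.com/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}/providers/Microsoft.Sql"
        f"/servers/{server}/databases/{database}?api-version=2021-11-01"
    )
    resp = requests.patch(
        url,
        headers={"Authorization": f"Bearer {token}"},
        json={"sku": {"name": sku_name}},   # e.g. "S3" in the morning, "S0" afterwards
    )
    resp.raise_for_status()

set_service_objective("S3")   # scale up for the heavy morning procedures
# ... run the procedures, then scale back down with set_service_objective("S0")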
Hope this helps.

How to update an existing Azure SQL database table as data changes in the source table (OData link/API), running everything in the cloud?

I am working on a school project and I am stuck; please provide some help.
I have imported a table from an open data website into my Azure SQL database, but the data changes every 30 minutes at the source. I want the recent data to be updated automatically every 30 minutes in the cloud.
Is that possible in an Azure SQL database?
At the moment I can do it through an Integration Services package run manually, but I could not deploy it to Azure SQL Database and execute it automatically.
Please help. Thank you in advance.
I recommend implementing that using an Azure WebJob with scheduling; there is a tutorial in the Azure WebJobs documentation. A WebJob is a kind of background service that can be executed on a schedule. Put your logic in the WebJob (get the data => put it into Azure SQL) and schedule it every 30 minutes.
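A minimal sketch of the WebJob's core logic, assuming a Python runtime with the requests and pyodbc packages available (the feed URL, table name, and key column are placeholders):

# Hedged sketch: pull the latest rows from the OData feed and upsert them into
# the Azure SQL table. OData responses wrap result rows in a "value" array.
import pyodbc
import requests

ODATA_URL = "https://example.org/odata/Dataset"   # placeholder feed URL
CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<server>.database.windows.net;DATABASE=<db>;UID=<user>;PWD=<password>"
)

rows = requests.get(ODATA_URL, timeout=30).json()["value"]

with pyodbc.connect(CONN_STR) as conn:
    cur = conn.cursor()
    for row in rows:
        # MERGE updates existing rows and inserts new ones (upsert) keyed on Id.
        cur.execute(
            """
            MERGE dbo.Dataset AS t
            USING (SELECT ? AS Id, ? AS Payload) AS s
            ON t.Id = s.Id
            WHEN MATCHED THEN UPDATE SET t.Payload = s.Payload
            WHEN NOT MATCHED THEN INSERT (Id, Payload) VALUES (s.Id, s.Payload);
            """,
            row["Id"], str(row),              # "Id" is a placeholder key column
        )
    conn.commit()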
