Azure Data Factory - Data Flow - Run SQL User-Defined Function

I have a scalar-valued function stored in a database on an Azure SQL Managed Instance. When I run queries/stored procs on the database, I can obviously call this function directly.
But now I need to move my SQL loads to Azure Data Factory data flows, and I could not find a way to call the user-defined function from the ADF data flow. I thought I could use the Select transformation to call the function the same way we do in a database query, but it looks like that can't be done. Does anybody have an idea how to call this function from the data flow?

You can call a user-defined function from the data flow's source transformation.
I have a scalar user-defined function created on Azure SQL.
In the Azure Data Factory data flow, connect the source to the Azure SQL database, and in Source options, set Input to Query and call the function in a SELECT statement to get the data.
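For example, assuming a hypothetical scalar function dbo.udfGetDiscount and table dbo.Orders, the source query might look like this:

    SELECT OrderId, Amount, dbo.udfGetDiscount(Amount) AS Discount
    FROM dbo.Orders;

The function is evaluated on the SQL side, so the data flow receives its result as an ordinary column.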

Related

Azure Functions vs Stored Procedures for Sequential Dependent Calculation Scenario

I have a scenario where I need to read data from a SQL Server database (Azure), perform calculations, and save the calculated data back to the SQL Server database.
Here, I'm using a Timer Trigger function so that I can schedule the calculations one after another, as they are dependent on each other (a total of 10 calculations running in sequence).
The same can be achieved via stored procedures in an easy way, as they reside in the backend. I want to understand which is the better way to handle such a scenario in terms of performance, scalability, debugging capabilities, cost, etc.
If you are using SQL Server, then a SQL stored procedure is definitely the right approach because of its compatibility with SQL Server.
Another recommended approach is to use the Data Flow activity in Azure Data Factory and transform the data using the available functions. This method is easy to use, as all the required transformation functions are built in.
You can also run a stored procedure in Azure Data Factory using the Stored Procedure activity.
Refer: Create Azure Data Factory data flows
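As a minimal sketch of the stored procedure approach (all procedure names here are hypothetical), the dependent calculations can simply be chained inside one wrapper procedure that the Stored Procedure activity, or a SQL Server Agent schedule, invokes:

    CREATE PROCEDURE dbo.usp_RunCalculationChain
    AS
    BEGIN
        -- Each step depends on the output of the previous one,
        -- so they run strictly in sequence
        EXEC dbo.usp_Calculation01;
        EXEC dbo.usp_Calculation02;
        -- ... calculations 03 through 10
        EXEC dbo.usp_Calculation10;
    END;

This keeps the whole chain in one unit next to the data, which is the main performance argument for the stored procedure approach.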

Azure ADF V2: Passing pipeline parameter of type array to Azure function which triggers Snowflake procedure

Our current migration project from Azure SQL to Snowflake is using ADF V2 as the orchestration tool. We need to call Snowflake procedures through an ADF pipeline which is parameterized (e.g., the pipeline has an array parameter with the list of tables and statements to be passed to the ADF activities within the pipeline). Since the ADF Stored Procedure activity does not support Snowflake procedure calls, we have a workaround of using an Azure Function to run Snowflake SQL statements; we were able to create one and used it in the ADF pipeline to call the procedure. This procedure has to be reused dynamically, accepting the table name from a pipeline parameter, which is an array containing all table names along with other fields.
But we are having difficulty figuring out how to pass ADF pipeline array parameters to the Azure Function call; we are not sure whether this is a limitation of the Azure Function activity in ADF V2.
Let's say we have a parameter called "ListTables". We can use the expression syntax below to take the value from the pipeline parameter:
@{pipeline().parameters.ListTables}
If you are using an Azure Function that is triggered by an HTTP request, then inside your request body you may need to pass the above parameter using this same syntax. Thank you.
In the example below, I am calling an Azure Function that is triggered by an HTTP request, and I am passing my "ListTables" array into the request body.
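A minimal sketch of what that request body might look like (the "tables" property name is an assumption; only "ListTables" comes from the pipeline):

    {
        "tables": @{pipeline().parameters.ListTables}
    }

At runtime, ADF substitutes the parameter's value into the body before it is sent to the function.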

Syncing Data with Azure Data Factory

Is it possible to set up a pipeline in Azure Data Factory that performs a MERGE between the source and the destination rather than an INSERT? I have been able to successfully select data from my source on-prem table and insert it into my destination, but I'd really like to set up a pipeline that will continually update my destination with any changes in the source, e.g., copying new records that are added to the source, or updating any data which changes on an existing record.
I've seen references to the Data Sync framework, but from what I can tell that is only supported in the legacy portal. My V12 databases do not even show up in the classic Azure portal.
There is the Stored Procedure activity, which could handle this. You could use Data Factory to land the data in a staging table, then call the stored procedure to perform the MERGE. Beyond that, Data Factory logic is not sophisticated enough to perform a merge the way you could in SSIS, for example. Custom activities are probably not suitable for this, IMHO. This is also in line with Data Factory being ELT rather than ETL.
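A minimal sketch of such a stored procedure, assuming hypothetical dbo.Customers_Staging and dbo.Customers tables keyed on CustomerId:

    CREATE PROCEDURE dbo.usp_MergeCustomers
    AS
    BEGIN
        MERGE dbo.Customers AS tgt
        USING dbo.Customers_Staging AS src
            ON tgt.CustomerId = src.CustomerId
        -- Update rows that changed in the source
        WHEN MATCHED THEN
            UPDATE SET tgt.Name = src.Name, tgt.Email = src.Email
        -- Copy rows that are new in the source
        WHEN NOT MATCHED BY TARGET THEN
            INSERT (CustomerId, Name, Email)
            VALUES (src.CustomerId, src.Name, src.Email);
    END;

The pipeline then becomes two steps: a Copy activity into the staging table, followed by a Stored Procedure activity that runs the MERGE.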

Create PDF automatically after new entry in database

I am not sure if this is even the right site for this, but I'll ask anyway.
Does anybody know a program that can create a PDF with data from a database using a complex SQL statement? When an employee finishes a request for a customer, I want the program to be triggered by the new entry in a database table and to fill out a pre-built PDF with data that it pulls from the database.
It needs to be able to process big, complex SQL statements.
The only way to run custom code in SQL Server is to create a CLR stored procedure, plus, in your specific use case, a trigger to start the PDF processing. You can write a class library that connects back to the database it is triggered from by using a specific keyword in the connection string (context connection=true). You can pass the key needed to select the whole dataset that fills the PDF as a stored procedure parameter.
https://msdn.microsoft.com/en-us/library/ms131094.aspx
In the class library you can reference third-party libraries to interact with an editable PDF and fill the fields you need with the data retrieved from the database. I suggest you have a look at the security concerns related to CLR use in SQL Server; basically, your code runs within the sqlservr.exe process, sharing its resources and access privileges.
https://msdn.microsoft.com/en-us/library/ms131071.aspx
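A minimal sketch of the T-SQL side, with hypothetical assembly, class, and table names (the class library itself is built separately):

    -- Register the compiled class library inside SQL Server
    CREATE ASSEMBLY PdfGenerator
        FROM 'C:\assemblies\PdfGenerator.dll'
        WITH PERMISSION_SET = EXTERNAL_ACCESS;
    GO
    -- Expose its method as a CLR stored procedure
    CREATE PROCEDURE dbo.usp_CreateRequestPdf
        @RequestId INT
    AS EXTERNAL NAME PdfGenerator.[PdfGenerator.Procedures].GeneratePdf;
    GO
    -- Start PDF processing when a request is completed
    -- (handles a single-row insert, for brevity)
    CREATE TRIGGER trg_Requests_Insert ON dbo.Requests AFTER INSERT
    AS
    BEGIN
        DECLARE @RequestId INT;
        SELECT @RequestId = RequestId FROM inserted;
        EXEC dbo.usp_CreateRequestPdf @RequestId;
    END;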
SSRS can produce PDFs. See if that table's insert trigger can call a CLR procedure that in turn executes the report, which can even be attached to an email.
The report can be as complex as you need, as long as the report's stored procedure can populate the data based on the newly inserted row.
For stability, avoid the trigger: do the insert itself in a stored procedure, and have that insertion procedure call the CLR.
Further, instead of using CLR, you can use an SSRS data-driven subscription. When creating the subscription, make it a one-time scheduled job. The insertion procedure can then invoke this 'expired' job from SQL Server Agent by using sp_start_job.
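For example (the job name here is hypothetical; SSRS creates a GUID-named SQL Server Agent job for each subscription, which you can look up under SQL Server Agent > Jobs):

    -- Kick off the subscription's Agent job from the insertion procedure
    EXEC msdb.dbo.sp_start_job @job_name = N'ReportSubscriptionJob';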

Can I call a SQL Server user-defined function from Microsoft Access?

I'm trying to upsize my Access application to an Access FE and a SQL Server database BE.
One of the problems I have is that queries with "filtering parameters" are executed client-side and require all rows to be sent from the server to the client.
example:
SELECT * FROM MyTable WHERE MyId = Forms!MyForm!MyControl.Value;
This query requires all the rows from MyTable to be sent from SQL Server to Access, which then executes the WHERE clause locally.
I've read about SQL Server's user-defined functions, and it looks like they could work for me, if only I could call them from Access the same way I can in a SQL Server query.
Can I do this?
Is MyID indexed? If so, then Access shouldn't be dragging the entire table across from the server.
I'm not sure where you're getting the criteria from, though, as that's not the SQL that a form filter is going to send. Or even a saved QueryDef with a hardwired reference to a control on a form.
Try dropping .Value (it's redundant as it's the default property).
Also, if it's a saved QueryDef, try defining the control reference as a parameter, i.e., PARAMETERS Forms!MyForm!MyControl Long;.
Basically, nothing that you report is standard Access behavior with ODBC linked tables to SQL Server. If it were, Access would be hell to upsize, and it's not at all.
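As an illustration of the PARAMETERS suggestion above, the saved QueryDef might look like this (table and control names are the ones from the question):

    PARAMETERS Forms!MyForm!MyControl Long;
    SELECT * FROM MyTable WHERE MyId = Forms!MyForm!MyControl;

With the reference declared as a typed parameter, Access can evaluate the control locally and send the server a simple parameterized WHERE clause instead of filtering client-side.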
Should be possible with a "pass-through" query.
I agree, a pass-through query will do it, but that will leave a persistent query object in your application window.
Using ADO recordset and connection objects will allow you to return filtered results into your recordset, but you'll need to write T-SQL statements, which differ from Access SQL.
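Either way (pass-through query or ADO), the statement is sent to the server verbatim, so it can call the UDF directly. A sketch of the T-SQL you would store in the pass-through query (dbo.udfCurrentUserId is a hypothetical function):

    -- Runs entirely server-side; only matching rows travel back to Access
    SELECT t.* FROM dbo.MyTable AS t
    WHERE t.MyId = dbo.udfCurrentUserId();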
