We have an external function defined in Snowflake. I want to call that function from my JavaScript UDF. Is it possible to do so?
Thanks
Currently, you can't call any other UDF, including external functions, from a JavaScript UDF. You can call them from SQL UDFs and from JavaScript stored procedures by executing them in a SQL statement.
You can also nest UDF calls, like SELECT function_1(function_2(my_value));
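As a rough sketch of the workaround (the names my_external_function, my_wrapper, and call_external are hypothetical), a SQL UDF can wrap the external function directly, and a JavaScript stored procedure can run it inside a SQL statement:

-- SQL UDF wrapping a hypothetical external function
CREATE OR REPLACE FUNCTION my_wrapper(v VARCHAR)
  RETURNS VARCHAR
AS
$$
  my_external_function(v)
$$;

-- JavaScript stored procedure executing the external function via SQL
CREATE OR REPLACE PROCEDURE call_external(v VARCHAR)
  RETURNS VARCHAR
  LANGUAGE JAVASCRIPT
AS
$$
  // Run the external function in a SQL statement and return the result
  var stmt = snowflake.createStatement({
    sqlText: "SELECT my_external_function(:1)",
    binds: [V]
  });
  var rs = stmt.execute();
  rs.next();
  return rs.getColumnValue(1);
$$;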
I have a scalar-valued function stored in a database on an Azure SQL Managed Instance. When I run queries/stored procs on the database, I can obviously run this function directly.
But now I need to move my SQL loads to Azure Data Factory data flows. I could not find a way to call user-defined functions from the ADF data flow. I thought I could use the SELECT transformation to call this function the same way we do it in a database query, but it looks like that can't be done. By any chance, would anybody have an idea of how to call this function from the data flow?
You can call the user-defined function from the data flow source transformation.
I have a scalar user-defined function created on Azure SQL.
In the Azure Data Factory data flow, connect the source to the Azure SQL database and, in Source options, select Input as Query and call the function in a SELECT statement to get the data.
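As a minimal sketch (the function dbo.fn_GetDiscount and the table dbo.Orders are hypothetical names), the query in the source transformation could look like:

SELECT o.OrderId,
       dbo.fn_GetDiscount(o.OrderId) AS Discount
FROM   dbo.Orders AS o;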
Our current migration project from Azure SQL to Snowflake is using ADF V2 as the orchestration tool. We need to call Snowflake procedures through an ADF pipeline which is parameterized (e.g. the pipeline has an array parameter with the list of tables and statements to be passed to ADF activities within the pipeline). Since the ADF Stored Procedure activity does not support Snowflake procedure calls, we have a workaround that uses an Azure Function to run Snowflake SQL statements; we were able to create one and used it in the ADF pipeline to call the procedure. This procedure has to be reused dynamically by accepting the table name from the pipeline parameter, which is an array containing all table names along with other fields.
However, we are having difficulty figuring out how to pass ADF pipeline array parameters to the Azure Function call, and we are not sure whether this is a limitation of the Azure Function activity in ADF V2.
Let's say we have a parameter called "ListTables". We can use the below expression syntax to take the value from your pipeline parameter.
@{pipeline().parameters.ListTables}
If you are using an Azure Function that is triggered by an HTTP request, then inside your request body you may need to pass the above parameter using the same syntax. Thank you.
In the below example, I am calling an Azure Function that is triggered by an HTTP request and passing my "ListTables" array into the request body.
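For instance, the Azure Function activity's request body could look roughly like this (the property name "tables" is hypothetical, and depending on how your function parses the payload you may need to wrap the parameter in string() to serialize the array):

{
    "tables": @{string(pipeline().parameters.ListTables)}
}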
I am working on a task converting Netezza to Snowflake, so I need a solution to convert a Netezza CREATE FUNCTION to a Snowflake CREATE FUNCTION.
Netezza user-defined functions are implemented in C++, while Snowflake user-defined functions are written in either SQL or JavaScript, so a direct conversion isn't possible. You will have to reimplement the function in JavaScript or SQL.
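As a rough sketch of what a reimplementation could look like (the function name and logic are hypothetical, standing in for whatever the original C++ UDF did):

CREATE OR REPLACE FUNCTION lpad_custom(s VARCHAR, len FLOAT, pad VARCHAR)
  RETURNS VARCHAR
  LANGUAGE JAVASCRIPT
AS
$$
  // JavaScript UDF arguments are referenced in uppercase
  if (!PAD) {
    return S;  // avoid an infinite loop on an empty pad string
  }
  var out = S;
  while (out.length < LEN) {
    out = PAD + out;
  }
  return out;
$$;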
I have a few Snowflake stored procedures and I need to call them from the DataStage tool. What is the procedure to do this?
You can call a Snowflake procedure by using CALL:
CALL myProcedure();
Docs: https://docs.snowflake.com/en/sql-reference/sql/call.html
Regarding DataStage: you can send the CALL statement the same way you send other SQL queries to Snowflake.
I am just wondering which will be faster: a T-SQL function/procedure or the CLR version of it? The procedure works with database data and uses cursors (in the T-SQL version).
When should I use CLR and when should I use T-SQL to create procedures and functions?
Simple rule-of-thumb:
data manipulation (SELECT, UPDATE, etc.) is best left to T-SQL (but without cursors! see the set-based sketch below),
while anything that has to do with processing (string/regex matching, date arithmetic, calling external web services, etc.) is a good match for SQL CLR.
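As a minimal sketch of the set-based pattern meant by "without cursors" (the tables and columns are hypothetical):

-- Set-based update instead of looping over rows with a cursor
UPDATE o
SET    o.Discount = 0.10
FROM   dbo.Orders AS o
JOIN   dbo.Customers AS c
       ON c.CustomerId = o.CustomerId
WHERE  c.IsVip = 1;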