I was wondering if anyone has sent data from Snowflake to an API (POST Request).
What is the best way to do that?
I'm thinking of using Snowflake to unload the data (COPY INTO) to Azure Blob Storage and then creating a script to send that data to the API. Or I could just use the Snowflake API directly, all within a script, and avoid Blob Storage.
Curious about what other people have done.
To send data to an API you will need to have a script running outside of Snowflake.
Then with Snowflake external functions you can trigger that script from within Snowflake - and you can send parameters to it too.
I did something similar here:
https://towardsdatascience.com/forecasts-in-snowflake-facebook-prophet-on-cloud-run-with-sql-71c6f7fdc4e3
The basic steps in that post are:
Have a script that runs Facebook Prophet inside a container that runs on Cloud Run.
Set up a Snowflake external function that calls the GCP proxy that calls that script with the parameters I need.
In your case I would look for a similar setup, with the script running within Azure.
https://docs.snowflake.com/en/sql-reference/external-functions-creating-azure.html
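For reference, a rough sketch of what the Azure side could look like, assuming an HTTP-triggered Azure Function written in Python (the function body and whatever you do with the rows are placeholders). Snowflake POSTs the rows as a JSON body of the form {"data": [[row_number, arg1, ...], ...]} and expects the same shape back, one result per row:
import json

import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    # Snowflake external functions send: {"data": [[row_number, arg1, ...], ...]}
    # and expect back:                   {"data": [[row_number, result], ...]}
    payload = req.get_json()

    results = []
    for row in payload["data"]:
        row_number, *args = row
        # Do whatever work you need with the row values here
        # (e.g. forward them to another service) and return one value per row.
        results.append([row_number, {"status": "ok", "args": args}])

    return func.HttpResponse(
        json.dumps({"data": results}),
        mimetype="application/json",
        status_code=200,
    )
On the Snowflake side, the API integration points at the Azure API Management endpoint that fronts this function, which is what the linked doc walks through.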
Related
I am new to REST APIs. What I basically understand about a REST API is that you need to call it each time to get the updated data.
In my case, I need to use the data received from the REST API to generate reports in Power BI. In addition, I should be able to "read" the data coming from the server and "write" to that data as well.
I did find the option of getting data from Web in Power BI to connect directly to the REST API. If I do that, I can only "read".
Can you help me with the different options for how I can do both "read/pull" and "write/push" to the server when I have its REST API? I am not sure whether a cloud service has to sit in between.
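As a rough sketch of the read and write halves, assuming a plain Python script and a generic REST endpoint (the URLs, auth header, and payload shape below are placeholders for whatever your server actually exposes):
import requests

BASE_URL = "https://example.com/api"            # placeholder endpoint
HEADERS = {"Authorization": "Bearer <token>"}   # only if the API needs auth

# "Read / pull": fetch the current data from the server.
response = requests.get(f"{BASE_URL}/records", headers=HEADERS, timeout=30)
response.raise_for_status()
records = response.json()

# "Write / push": send a new or updated record back to the server.
payload = {"id": 42, "status": "reviewed"}
response = requests.post(f"{BASE_URL}/records", json=payload, headers=HEADERS, timeout=30)
response.raise_for_status()
Power BI's Web connector can keep handling the read side directly; the write side has to go through something like the script above (or a small service in between), since the built-in connector only reads.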
I need to perform a series of actions after the data load from Snowpipe to the landing area is completed, and I want it to run on its own. Please suggest if you see any option here other than tasks.
You can, but not directly. You may want to look at external functions.
Snowflake supports external functions, which can call a cloud API and therefore run any code that you are able to expose via a supported API.
The hyperlinked article gives a walk-through of creating an external function that you can adapt to suit your needs. The walk-through is specific to AWS, but there are options for Azure and GCP.
Platforms that Support Calling an External Function
In general, an external function can be called from a Snowflake account on any cloud platform that Snowflake supports:
Amazon Web Services (AWS)
Microsoft Azure
Google Cloud Platform (GCP)
The steps you have to take to ensure appropriate security won't all fit here; however, in summary you do the following:
Create an AWS IAM service role to execute all of the Lambda functions
Create AWS Lambda functions for our examples
Create an AWS API gateway to expose the Lambda functions to Snowflake
Create an AWS IAM role to access the API gateway and configure it to access both the API gateway and a Snowflake API integration object
Create and test some external function objects in Snowflake
Sample Code
CREATE OR REPLACE EXTERNAL FUNCTION external_functions.lambda.sum(filename VARCHAR, rowcount NUMBER)
RETURNS VARIANT
API_INTEGRATION = aws_lambda
AS 'https://xxxxxxxxxx.execute-api.eu-west-2.amazonaws.com/snowflake-external-function-stage/snowflake-sum';
If you decide to leverage external functions:
You may create a task, which runs a stored procedure.
This stored procedure could read from your load history:
SELECT *
FROM (
    SELECT *
    FROM TABLE(information_schema.copy_history(
        table_name => 'abc.mytable',
        start_time => DATEADD(hours, -1, CURRENT_TIMESTAMP())
    ))
) q
WHERE q.pipe_name ILIKE 'mypipe';
And use the output from the copy history to call your external function.
This external function could trigger any action that you can program on your cloud provider. This would be a "serverless" option.
If you are running on-premises or on a cloud VM, you can write a Python script that runs in Airflow or cron, processes your copy history, and runs the next set of actions.
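A minimal sketch of that script, assuming the snowflake-connector-python and requests packages; the account details, table, pipe name, and the downstream endpoint it calls are all placeholders:
import requests
import snowflake.connector

COPY_HISTORY_SQL = """
SELECT file_name, row_count, status
FROM TABLE(information_schema.copy_history(
    table_name => 'abc.mytable',
    start_time => DATEADD(hours, -1, CURRENT_TIMESTAMP())
))
WHERE pipe_name ILIKE 'mypipe'
"""

def main():
    # Connection details are placeholders; use your own secrets handling.
    conn = snowflake.connector.connect(
        account="my_account",
        user="my_user",
        password="my_password",
        warehouse="my_wh",
        database="mydb",
    )
    cur = conn.cursor()
    try:
        rows = cur.execute(COPY_HISTORY_SQL).fetchall()
    finally:
        cur.close()
        conn.close()

    # Kick off the next set of actions for each newly loaded file,
    # e.g. by POSTing to a downstream service (hypothetical endpoint).
    for file_name, row_count, status in rows:
        if status.lower() == "loaded":
            requests.post(
                "https://example.com/next-step",
                json={"file": file_name, "rows": row_count},
                timeout=30,
            ).raise_for_status()

if __name__ == "__main__":
    main()
The same logic could run as an Airflow task instead of from cron; only the scheduling wrapper changes.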
I would like to build an FTP service using Azure Logic Apps / Azure Functions. I would like the Logic App to be invoked via an HTTP request (I will expose this app as a REST API later). The FTP server details, like directory, username, and password, will be sent in the request.
Is there a way by which I can have my Logic App create the FTP connector dynamically based on the incoming request and then do an FTP upload or download?
You cannot create a connection at Logic Apps run time; it needs to happen at author time. If it's a pre-defined list of connections, you can create them first, then use a switch-case to branch into the connection that should be used at run time.
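If you go the Azure Functions route that the question also mentions, the connection details can simply arrive with each request, because the function opens its own FTP connection at run time. A rough sketch in Python using the standard-library ftplib; the field names in the request body (host, username, password, directory, filename, content) are assumptions:
import io
import json
from ftplib import FTP

import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    # FTP details are supplied per call, so no connector has to exist ahead of time.
    body = req.get_json()

    ftp = FTP(body["host"])                      # connect to the requested server
    ftp.login(body["username"], body["password"])
    ftp.cwd(body.get("directory", "/"))          # move to the target directory

    # Upload the payload as a file.
    data = io.BytesIO(body["content"].encode("utf-8"))
    ftp.storbinary(f"STOR {body['filename']}", data)
    ftp.quit()

    return func.HttpResponse(
        json.dumps({"status": "uploaded", "file": body["filename"]}),
        mimetype="application/json",
    )
A download would work the same way with ftplib's retrbinary; the caller just sends the connection details and file information with each request.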
I am investigating ways to move data from SQL Server into a system exposed via a RESTful HTTP API.
If I were to use SSIS would I have to write a custom connector to push the data to the HTTP API after the transform step, or is there a built in feature that supports pushing to an HTTP API?
If you only want to move a very small amount of data, you could use the Web Service Task
...but note that pushing data out of SQL Server is not what this task is intended for...
The Web Service task executes a Web service method. You can use the Web Service task for the following purposes:
Writing to a variable the values that a Web service method returns. For example, you could obtain the highest temperature of the day from a Web service method, and then use that value to update a variable that is used in an expression that sets a column value.
Writing to a file the values that a Web service method returns. For example, a list of potential customers can be written to a file and the file then used as a data source in a package that cleans the data before it is written to a database.
For more control, you'll want to look at using the Script Component in a data flow. It gives you much more flexibility and control.
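If you would rather not write the Script Component in C#/VB, the same push can also be done entirely outside SSIS with a small script; a rough sketch using pyodbc and requests, where the connection string, query, and API endpoint are all placeholders:
import pyodbc
import requests

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
)
API_URL = "https://example.com/api/records"   # hypothetical endpoint

conn = pyodbc.connect(CONN_STR)
cursor = conn.cursor()
cursor.execute("SELECT id, name, amount FROM dbo.my_table")  # placeholder query

# Push each row to the HTTP API as JSON.
for row_id, name, amount in cursor.fetchall():
    requests.post(
        API_URL,
        json={"id": row_id, "name": name, "amount": float(amount)},
        timeout=30,
    ).raise_for_status()

cursor.close()
conn.close()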
I am building a website (probably in Wordpress) which takes data from a number of different sources for display on various pages.
The sources:
A Twitter feed
A Flickr feed
A database on a remote server
A local database
From each source I will mainly retrieve
A short string, e.g. the text of the tweet from Twitter, or the title of a blog page from the local database.
An associated image, if one exists
A link identifying the content at its source
My question is:
What is the best way to a) store the data and b) retrieve the data?
My thinking is:
i) Write a script that is run every 2 or so minutes on a cron job
ii) the script retrieves data from all sources and stores it in the local database
iii) application code can then retrieve all data from the one source, the local database
This should make application code easier to manage - we only ever draw data from one source in application code - and that's the main appeal. But is it overkill for a relatively small site?
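For what it's worth, a minimal sketch of step ii) of that plan, assuming Python for the cron script and a local SQLite table (the feed URLs and field names are placeholders; the real Twitter and Flickr calls need API keys):
import sqlite3
from datetime import datetime, timezone

import requests

# Placeholder feed endpoints; the real Twitter/Flickr calls need API keys.
FEEDS = {
    "twitter": "https://example.com/twitter-feed.json",
    "flickr": "https://example.com/flickr-feed.json",
}


def store_items(db_path="site_feed.db"):
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS feed_items "
        "(source TEXT, title TEXT, image_url TEXT, link TEXT, fetched_at TEXT)"
    )
    for source, url in FEEDS.items():
        for item in requests.get(url, timeout=30).json():
            conn.execute(
                "INSERT INTO feed_items VALUES (?, ?, ?, ?, ?)",
                (
                    source,
                    item.get("title"),       # the short string
                    item.get("image_url"),   # associated image, if any
                    item.get("link"),        # link back to the source
                    datetime.now(timezone.utc).isoformat(),
                ),
            )
    conn.commit()
    conn.close()


if __name__ == "__main__":
    store_items()   # call this from a cron entry every couple of minutes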
I would recommend handling the Twitter feed and Flickr feed in JavaScript. Both Flickr and Twitter have REST APIs. By putting it on the client you free up resources on your server and reduce complexity; your users won't be waiting around for your server to fetch the data, and you can let Twitter and Flickr cache the data for you.
This assumes you know JavaScript. Once you get past its quirks, it's not a bad language. Give jQuery a try; there is a jQuery Twitter plugin and a Flickery jQuery plugin, among others. Those are just the first results from Google.
As for your data on the local server and the remote server, that will depend more on the data being fetched. I would go with whatever you can develop the fastest that gives acceptable results. If that means making a REST call from server to server, then go for it. If the remote server is slow to respond, I would go with the AJAX REST API method.
And for the local database, you are going to have to write server-side code for that, so I would do it inside the WordPress "framework".
Hope that helps.