How to format body to pass input dataset as parameter in Azure ML? - azure-logic-apps

I'm trying to consume my Azure ML Pipeline (Batch) from Logic Apps. For that, I've deployed the batch pipeline with the dataset as a parameter:
But I can't figure it out, how to format my body to invoke the pipeline and I don't find documentation on that.
For now, this is what my Logic App looks like:
and I get a 415 HTTP error code when trying to invoke the pipeline.
Thanks for your help.
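A 415 (Unsupported Media Type) usually means the request is missing a `Content-Type: application/json` header, which the Logic App HTTP action does not add by default. As a sketch under assumptions (the endpoint URL, experiment name, parameter name `input_dataset`, and dataset GUID below are all placeholders you must replace with your own), the body for a published pipeline typically carries string parameters in `ParameterAssignments` and dataset parameters in `DataSetDefinitionValueAssignments`:

```python
import json

# Hypothetical placeholder endpoint for a published Azure ML pipeline.
endpoint = "https://<region>.api.azureml.ms/pipelines/v1.0/<pipeline-path>"

# "input_dataset" must match the name of the dataset PipelineParameter
# you defined when publishing the pipeline (placeholder here).
body = {
    "ExperimentName": "logicapp-batch-run",
    "DataSetDefinitionValueAssignments": {
        "input_dataset": {
            "SavedDataSetReference": {"Id": "<registered-dataset-guid>"}
        }
    },
}

# The missing/wrong Content-Type header is the usual cause of a 415.
headers = {
    "Authorization": "Bearer <token>",
    "Content-Type": "application/json",
}

payload = json.dumps(body)
print(payload)
```

In the Logic App HTTP action, the same JSON goes in the Body field and `Content-Type: application/json` goes in the Headers table.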

Related

How to format powershell output in Logic App

I am trying to fetch the expiry date of all the webhooks using a runbook and then send that output in an email using a Logic App.
Below is the script to get the webhook expiration dates, which are stored in a hashtable.
Powershell Script
This is the powershell output on the screen
Powershell Output
But this is the format in which I am getting the email
Email Notification
Can anyone explain how to get the email notification in table format?
I am using the Get Job Output connector to get the runbook output, and I select (Content) to get the output from the previous step.
Logic App Email Connector
You are seeing this because the output from runbooks is rendered in a paragraph tag inside the mail connector. After reproducing from my end, I could get the expected result using a `pre` tag, which defines preformatted text. Alternatively, you can send the output as an attachment to the mail, which gives you the output in the required format. Below is the flow of my logic app.
Results in mail:
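As a minimal sketch, the email body of the send-mail action can wrap the runbook output in a `pre` element (the `Get_job_output` action name and the `content` property path below are assumptions; use the names from your own Logic App):

```html
<p>Webhook expiry report:</p>
<pre>@{body('Get_job_output')?['content']}</pre>
```

The `pre` tag preserves the line breaks and column spacing of the PowerShell table output instead of collapsing it into one paragraph.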

Ways to Send Snowflake Data to a REST API (POST)

I was wondering if anyone has sent data from Snowflake to an API (POST Request).
What is the best way to do that?
Thinking of using Snowflake to unload (COPY INTO) Azure blob storage then creating a script to send that data to an API. Or I could just use the Snowflake API directly all within a script and avoid blob storage.
Curious about what other people have done.
To send data to an API you will need to have a script running outside of Snowflake.
Then with Snowflake external functions you can trigger that script from within Snowflake - and you can send parameters to it too.
I did something similar here:
https://towardsdatascience.com/forecasts-in-snowflake-facebook-prophet-on-cloud-run-with-sql-71c6f7fdc4e3
The basic steps on that post are:
Have a script that runs Facebook Prophet inside a container that runs on Cloud Run.
Set up a Snowflake external function that calls the GCP proxy that calls that script with the parameters I need.
In your case I would look for a similar setup, with the script running within Azure.
https://docs.snowflake.com/en/sql-reference/external-functions-creating-azure.html
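The steps above can be sketched from the script's side. Snowflake external functions send a JSON body of the shape `{"data": [[row_number, arg1, ...], ...]}` and expect a response in the same shape. A minimal handler (the forwarding POST to your downstream REST API is left as a comment, and the response fields are illustrative):

```python
import json

def handle_snowflake_request(request_body: str) -> str:
    """Handle the JSON shape Snowflake external functions send:
    {"data": [[row_number, arg1, ...], ...]}.
    Returns a response in the same shape, as Snowflake requires."""
    rows = json.loads(request_body)["data"]
    results = []
    for row in rows:
        row_number, payload = row[0], row[1:]
        # Here you would POST `payload` to the downstream REST API
        # (e.g. with urllib.request); omitted so the sketch runs offline.
        results.append([row_number, {"status": "sent", "values": payload}])
    return json.dumps({"data": results})

# Example: two rows, each with one argument.
resp = handle_snowflake_request('{"data": [[0, "a"], [1, "b"]]}')
print(resp)
```

On Azure, this handler would sit behind an Azure Function plus API Management proxy, which is the Azure equivalent of the GCP proxy in the linked post.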

Get all-time-impressions for given contentName in Matomo Reporting API

I'd like to request the core metrics for a given contentName in Matomo's Reporting API. The contentName in this example is anwalt:4247, and I send this request:
https://statistics/?method=Contents.getContentNames&segment=contentName==anwalt:4247&label=anwalt:4247&date=2019-01-01,today&period=range&format=JSON&module=API&idSite=1&format=JSON&token_auth=93exx3
gives
[{"label":"anwalt:4247","nb_visits":27,"nb_impressions":37,"nb_interactions":12,"sum_daily_nb_uniq_visitors":27,"interaction_rate":"32,43\u00a0%","segment":"contentName==anwalt%3A4247","idsubdatatable":1}]
or this
https://statistics/?method=Contents.getContentNames&label=anwalt:4247&date=2019-01-01,today&period=range&format=JSON&module=API&idSite=1&format=JSON&token_auth=93exx3
gives
[{"label":"anwalt:4247","nb_visits":21,"nb_impressions":28,"nb_interactions":8,"sum_daily_nb_uniq_visitors":21,"interaction_rate":"28,57\u00a0%","segment":"contentName==anwalt%3A4247","idsubdatatable":282}]
But both numbers are wrong (they differ from what the Matomo UI shows).
Isn't there any simple request for that common task?
What you tried with &date=2019-01-01,today&period=range should work fine; what exactly is wrong in the output data?
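One common pitfall with hand-built Matomo request URLs is the encoding of the segment value (`:` and `=` must be percent-encoded). A sketch that builds the same Contents.getContentNames request programmatically, with the base URL and token as placeholders:

```python
from urllib.parse import urlencode

# Placeholder Matomo instance and token; replace with your own.
base_url = "https://statistics/"
params = {
    "module": "API",
    "method": "Contents.getContentNames",
    "idSite": 1,
    "period": "range",
    "date": "2019-01-01,today",
    # urlencode percent-encodes the ':' and '==' in the segment value.
    "segment": "contentName==anwalt:4247",
    "format": "JSON",
    "token_auth": "<token>",
}
url = base_url + "?" + urlencode(params)
print(url)
```

Comparing a URL built this way against the hand-written one can reveal whether a stray unencoded character is changing which rows the segment matches.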

Send cluster or array of data generated by a machine programmed in LabVIEW, to Firebase

I've already tried to connect LabVIEW to Firebase, and it needs a json file. I added a knob to control the input and meter as an output. When I change the knob, let's say, to 5 for example, the meter also changes to 5.
Now the machine code is actually made, and it gives an array for the results of tests, and I want to send that array to Firebase.
How can I accomplish this?
You can use the REST API of your Realtime Database in Firebase:
Firebase REST API
Then your LabVIEW code can use HTTP POST and HTTP GET to save and retrieve data.
The URL has the following syntax:
https://proyectID.firebaseio.com/Datacollection/somename.json
somename.json does not mean a file; it is just the node under which the data is saved in your database. To encode any LabVIEW data, use the "Flatten To JSON" function.
sample code
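The same REST pattern can be sketched outside LabVIEW (in LabVIEW the equivalents are the HTTP Client POST VI and "Flatten To JSON"; the project URL, node name, and values below are placeholders):

```python
import json
import urllib.request

# Placeholder Realtime Database URL; "results.json" names the database
# node to write under, not a file on disk.
url = "https://projectID.firebaseio.com/Datacollection/results.json"

# Array of machine test results, serialized to JSON
# (the equivalent of LabVIEW's "Flatten To JSON").
payload = json.dumps({"testResults": [4.8, 5.0, 5.1]}).encode("utf-8")

req = urllib.request.Request(url, data=payload, method="POST",
                             headers={"Content-Type": "application/json"})
# urllib.request.urlopen(req) would send it; omitted here so the
# sketch runs without network access.
print(payload.decode())
```

A POST to a `.json` path appends the data under an auto-generated key; a PUT to the same path would overwrite the node instead.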

Postman Chrome: how to use variables from an array for the URL

Sorry to bother you, but I could really use some help with the following:
I am using the Postman Chrome extension against a REST API.
My URI is like the following and the request type is PUT:
PUT https://{{api-fqdn}}/some/path/to/something/ROLE111
I have about 100 roles with different names.
Instead of passing each time different role name at the end of the URI, how could I use Postman to scan an array and replace all the values inside using a variable, such as {{rolename}}?
Thanks a ton in advance!
Take a look at using the native Postman app with the Collection Runner. You can create a data file (either JSON or CSV) with your different role names, and the Runner will replace the variable placeholder with the data from the file.
More info can be found here: https://learning.postman.com/docs/postman/collection-runs/working-with-data-files/
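As a sketch, a JSON data file for the Collection Runner is an array of objects whose keys match the variable names used in the request URL, so `{{rolename}}` in `PUT https://{{api-fqdn}}/some/path/to/something/{{rolename}}` would be filled from a file like (role names here are illustrative):

```json
[
  {"rolename": "ROLE111"},
  {"rolename": "ROLE112"},
  {"rolename": "ROLE113"}
]
```

The Runner executes the request once per array element, substituting each `rolename` value in turn.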
