How to write triggered Azure alerts to a OneNote or Excel file via Microsoft Power Automate

I have set up multiple Azure alerts to monitor Azure web app performance (4xx errors, 5xx errors, response time). When any of the alert rules fires, it sends an alert to my Microsoft Outlook email.
I want to write the alert details, such as the alert name, date, and email subject, to either a OneNote or an Excel file every time one of these alerts fires, so that I can keep track of them.
I tried Microsoft Power Automate. It has a template that, when a new email arrives, filters on the email subject and then creates issues/tasks/bugs in Azure DevOps, but I couldn't find a template that writes alert details to an Excel file or OneNote.
Is there a better way to do this?

Based on your requirement: Power Automate does not have a predefined template that records the alert name, the alert triggered time, and the subject of the alert email directly to an Excel sheet.
We wrote a custom workflow in Power Automate using the Outlook connector, the Excel Online (Business) connector, and Compose actions, and tested it in our environment, where it works as expected, as shown below.
Since the alert email body is in HTML format, we used the Content Conversion connector's "Html to text" action to convert the email body to plain text.
We then used Compose actions to pull out the alert triggered time and the subject of the alert.
Here are the expressions we used in the Compose actions to pull the alert triggered time and the subject.
For the alert triggered time:
first(split(last(split(outputs('Html_to_text_2')?['body'],'at ')),'Rule ID'))
For the subject:
split(triggerOutputs()?['body/subject'],'Severity:3 ')
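If you want to sanity-check the string handling outside Power Automate, here is a rough Python equivalent of the two expressions above, run against a made-up alert body and subject (the sample text is hypothetical, not taken from the original flow):

    # Hypothetical plain-text alert body and subject, shaped like the
    # Azure Monitor emails that the expressions above parse.
    body_text = (
        "Alert WebAppResponseTime was triggered at 2023-05-01T10:15:00Z "
        "Rule ID /subscriptions/xxx/alertrules/WebAppResponseTime"
    )
    subject = "Azure Monitor alert Severity:3 WebAppResponseTime"

    # first(split(last(split(body, 'at ')), 'Rule ID')) in Power Automate:
    triggered_at = body_text.split("at ")[-1].split("Rule ID")[0].strip()

    # split(subject, 'Severity:3 ') -- take the element after the marker:
    alert_name = subject.split("Severity:3 ")[-1]

    print(triggered_at)  # 2023-05-01T10:15:00Z
    print(alert_name)    # WebAppResponseTime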

Related

Is there a way to raise a SNOW ticket as a notification for query failures in Snowflake?

I was going through the integration documents available for Snowflake and ServiceNow, but all of the documents are oddly focused on Snowflake consuming ServiceNow data for analytics. I didn't find anything related to creating tickets for failures in Snowflake. Is that possible?
It's not about the monitoring and notification aspects of Snowflake, but about connecting to ServiceNow and raising a ticket for query failures (tasks, stored procedures, etc.).
Any ideas?
There's no functionality like that as of now. I recommend you open an Idea for it; if enough customers want it, our Product Management team will review it.
For Snowpipe, we found a workaround: we send the error message to SNS, and a Lambda function then calls the ServiceNow REST API to create a ticket.
For Tasks, we found that it is possible to use External Functions to notify AWS whenever a Task fails, but we haven't implemented that yet.
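For the Lambda step, a minimal sketch could look like the following. It assumes the standard ServiceNow Table API; the instance name, credentials, and field values are placeholders rather than details from the original setup (in practice, read the credentials from Secrets Manager):

    import base64
    import json
    import urllib.request

    # Placeholders -- substitute your own instance and integration user.
    SNOW_INSTANCE = "example.service-now.com"
    SNOW_USER = "integration.user"
    SNOW_PASSWORD = "secret"

    def lambda_handler(event, context):
        # SNS delivers the Snowpipe error message inside the Records payload.
        message = event["Records"][0]["Sns"]["Message"]

        payload = json.dumps({
            "short_description": "Snowpipe load failure",
            "description": message,
        }).encode("utf-8")

        # Create an incident via the ServiceNow Table API with basic auth.
        auth = base64.b64encode(f"{SNOW_USER}:{SNOW_PASSWORD}".encode()).decode()
        req = urllib.request.Request(
            f"https://{SNOW_INSTANCE}/api/now/table/incident",
            data=payload,
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Basic {auth}",
            },
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            return {"status": resp.status}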
Email is a simple way. You need to determine how your ServiceNow instance processes emails; we implemented incident creation from Azure App Insights based on emails.
In ServiceNow, find the Inbound Action that should process the email, or create one.
ServiceNow provides every instance with an email account.
The instance email is usually xxxx@service-now.com.
If your instance URL is "audi.service-now.com", the email would be "audi@service-now.com".
For a PDI, e.g., dev12345@servicenowdevelopers.com.
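On the sending side, the email route can be as simple as one SMTP message to the instance address. A sketch, assuming an Inbound Action on the ServiceNow side parses the subject and body (the addresses and SMTP host are placeholders):

    import smtplib
    from email.message import EmailMessage

    # Placeholder addresses and SMTP host -- not from the original answer.
    msg = EmailMessage()
    msg["Subject"] = "Snowflake task failure: MY_TASK"
    msg["From"] = "alerts@example.com"
    msg["To"] = "dev12345@servicenowdevelopers.com"
    msg.set_content("Task MY_TASK failed; see query history for details.")

    # The Inbound Action on the instance turns this email into an incident.
    with smtplib.SMTP("smtp.example.com") as smtp:
        smtp.send_message(msg)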

Is it possible to create private report filtering in an embedded Data Studio report?

I created a report in Data Studio and embedded it on my website. I activated the option "anyone with the link can view", so the report is visible to my website users.
But I need to show my website users different data depending on their user IDs, and, more importantly, I don't want users to be able to see other users' data. If I used URL filtering, a user could tamper with the URL and substitute another user's ID to see that user's data.
Does anyone have a solution for this scenario?
In the Google documentation I saw an option to limit the report to users in my domain. I assume this would solve the issue, but I can't find how to restrict other domains.
Users are logged in to Google
If users of your website are already logged in to Google, use the Filter by email address guide from the Data Studio help center. This requires you to set up FILTER BY EMAIL and to have a field in your data that can be used directly as an email filter.
Users are not logged in to Google
If you want a solution where the users don't have to be logged in to Google, you will need to:
Create a Community Connector to pass the filtered data to your users. The connector should accept a short-lived token as part of its config.
Create a dashboard with your connector and pass a unique short-lived token for each user.
You should have an endpoint that returns the current user's data based on the token provided (a minimal sketch follows below). Alternatively, the endpoint can return only the user's identity, and you can query a secondary data source with a service account, filtering for that identity.
Your connector should call your endpoint to fetch data only for the user, or for the user's identity.
This official guide demonstrates how to implement this in more detail.
Disclaimer: I work in the Data Studio team and wrote the above guide.
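For illustration, the token-checked endpoint from the steps above might look something like this minimal Flask sketch (the route, token store, and data shape are all hypothetical):

    from flask import Flask, jsonify, request

    app = Flask(__name__)

    # Hypothetical short-lived token store (token -> user id). In practice,
    # tokens would be issued when the user logs in and would expire quickly.
    TOKENS = {"abc123": "user-42"}

    # Hypothetical per-user data, keyed by user id.
    USER_DATA = {"user-42": [{"date": "2024-01-01", "revenue": 10}]}

    @app.route("/report-data")
    def report_data():
        token = request.args.get("token", "")
        user_id = TOKENS.get(token)
        if user_id is None:
            return jsonify({"error": "invalid token"}), 403
        # Return only the calling user's rows, never anyone else's.
        return jsonify(USER_DATA.get(user_id, []))

The Community Connector would pass the token from its config to this endpoint and surface only the rows it returns.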
The first option is to add two extra fields to your data source:
User_ID
Password
For example:
Data, User_ID, Password
$10,Daniel,123
$20,Alex,456
In your dashboard, create two parameters:
User_ID_Parameter
Password_Parameter
Both parameters can have a default value of NULL and accept any values.
Then create a new calculated field:
CASE
WHEN REGEXP_MATCH(User_ID, User_ID_Parameter) AND REGEXP_MATCH(Password, Password_Parameter) THEN 1
ELSE 0
END
Then add a filter to the chart that you want to hide:
Include rows where the above calculated field is Equal to 1 (see the sketch below).
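To see what the calculated field is doing, here is the same gating logic expressed in Python for illustration (Data Studio's REGEXP_MATCH requires a full match, which re.fullmatch approximates):

    import re

    # A row is kept (1) only when both viewer-supplied parameters
    # fully match the row's User_ID and Password fields.
    def row_visible(row, user_id_param, password_param):
        if re.fullmatch(user_id_param, row["User_ID"]) and \
           re.fullmatch(password_param, row["Password"]):
            return 1
        return 0

    rows = [
        {"Data": "$10", "User_ID": "Daniel", "Password": "123"},
        {"Data": "$20", "User_ID": "Alex", "Password": "456"},
    ]
    # A viewer who enters Daniel/123 sees only Daniel's row.
    print([row_visible(r, "Daniel", "123") for r in rows])  # [1, 0]

Because the parameters are treated as regular expressions, a pattern such as .* matches every row, so this approach is better thought of as obfuscation than as hard security.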
The second option is to use Data Studio's built-in row-level security.
The only caveat is that users need to sign in before they can view the report.

Row level filtering with Data Studio and BigQuery - Sharing the dashboard with external users

In order to create a Data Studio dashboard for external users using the row-level permissions method, I followed the instructions here: https://developers.google.com/datastudio/solution/row-level-filtering.
Everything was working fine until I got to the sharing part:
Make the dashboard available to users
In order to share the dashboard with external users, I need to share the connector script.
Is there a better and safer way to share the dashboard with external users using the row-level permissions method?
The ability to Filter by email address (Provide row-level security to the data for signed-in users) was released today (13 Feb 2020).
Quoting the steps from the support page:
Create an email filter
1. Edit your data source.
2. In the upper left, click FILTER BY EMAIL.
3. Turn on Filter data by viewer email.
4. Select the field in your data source that contains viewer email addresses.
5. To return to the data source editor, click ALL FIELDS.

Writeback to Salesforce using PowerForms

I am using PowerForms to collect signatures through DocuSign. I want to send some of the data from the PDF back to Salesforce when the user signs it. I created custom fields on the PDF, related them to Salesforce, and checked the "writeback" and "allow sender to edit" boxes, but the data has not been written back to Salesforce. Any help?
When you send from Salesforce, information such as the Source ID and Source Object Type is passed to the document, which DocuSign Connect attempts to match to a record in order to push the data back into Salesforce.
If you are using a PowerForm, you have to pass along similar data and set up DocuSign Connect correctly for it to relate the data back to a Salesforce record.
This is doable with customization, but it does not work out of the box.
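As a hedged illustration of "passing along similar data": PowerForm URLs accept extra query parameters, and envelope custom fields can be populated with an EnvelopeField_ prefix. The PowerForm ID, record Id, and field names below are placeholders you would need to match to your own setup:

    from urllib.parse import urlencode

    # Placeholder PowerForm URL and parameters; the EnvelopeField_ prefix
    # populates envelope custom fields so DocuSign Connect can relate the
    # completed envelope back to a Salesforce record.
    base_url = "https://demo.docusign.net/Member/PowerFormSigning.aspx"
    params = {
        "PowerFormId": "00000000-0000-0000-0000-000000000000",   # placeholder
        "EnvelopeField_SourceID": "0062w00000AbCdE",             # hypothetical record Id
        "EnvelopeField_Source_Object_Type": "Opportunity",
    }
    print(f"{base_url}?{urlencode(params)}")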

Export CSV as portal user

I have a Salesforce report that I export to CSV using a URL like https://tapp0.salesforce.com/00OT00000014APi?export=1&enc=UTF-8&xf=csv, following this blog post.
This works fine and is very fast when I run it as a fully licensed user.
However, when I try the same report export as a Gold Partner portal user, I get an "Insufficient Privileges" error.
I have marked the report as deployed.
I have given all users access to the report folder.
The user has the correct sharing and profile rules set up to view the data in the report.
Going to just the report URL by itself works: https://tapp0.salesforce.com/00OT00000014APi
It only fails when I try to export to CSV.
I do realise I am using an unsupported internal API call, but is there any way portal users can export reports to CSV?
I'm not sure about doing it through the UI, but this can be done through a SOQL query, and portal users do have limited access to the API. Most, but not all, reports can be converted to SOQL queries that produce the same output.
For running your query, the easiest option is probably to build a Visualforce page backed by an Apex controller that runs the query and outputs CSV. Take a look at the contentType attribute on the apex:page tag: you can set values like application/vnd.ms-excel#contacts.xls to automatically export a data table to Excel. I didn't try it, but it probably works with CSV too; worst case, you open the file in Excel first and save it as CSV.
Also, if you don't mind portal users having to leave Salesforce to get their CSV, you might want to try Workbench, an app I built that allows portal-user login and helps you build the SOQL query for CSV export through both the SOAP and Bulk APIs.
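If API access works for your portal users, the SOQL-to-CSV export can also be scripted. A rough sketch using the third-party simple_salesforce library, with placeholder credentials, object, and fields:

    import csv
    import io

    from simple_salesforce import Salesforce

    # Placeholder credentials -- the portal user needs API access enabled.
    sf = Salesforce(
        username="portal.user@example.com",
        password="password",
        security_token="token",
    )

    # A SOQL query approximating the report's output.
    fields = ["Name", "StageName", "Amount"]
    records = sf.query_all("SELECT Name, StageName, Amount FROM Opportunity")["records"]

    # Write the rows out as CSV.
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fields)
    writer.writeheader()
    for rec in records:
        writer.writerow({f: rec.get(f) for f in fields})

    print(buf.getvalue())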
OK, I found the problem.
Go to Salesforce > Setup > Manage Users > Profiles.
Click to edit the RS_PortalUser profile.
Check the boxes next to "Run Reports" and "Export Reports".
Click Save.
