Azure Logic Apps (preview) trigger history - azure-logic-apps

Greetings for the day.
I am using the Logic Apps (preview) version and I want to save my workflow's trigger history in a database table.
So, is there any way to do that?

1. You can use this REST API to get a list of workflow trigger histories and then store them in a table (see the sketch after this list):
https://learn.microsoft.com/en-us/rest/api/logic/workflowtriggerhistories/list
2. You can enable diagnostic settings:
select Diagnostic settings > Add diagnostic setting.
Send the Logic App's run logs to a storage account.
After about twenty minutes, the corresponding log information appears in the container.
Finally, you can read that information from the container and save it to your database table.
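For option 1, here's a minimal sketch of calling the trigger-histories REST API and persisting each record. It assumes you already have an Azure AD bearer token (for example via @azure/identity) and some saveToTable() helper of your own; the token, the IDs, and saveToTable() are placeholders, not part of the linked API:

```typescript
// Sketch: list a workflow's trigger histories and persist each record.
const subscriptionId = "<subscription-id>";
const resourceGroup = "<resource-group>";
const workflowName = "<workflow-name>";
const triggerName = "<trigger-name>";
const token = "<azure-ad-bearer-token>"; // e.g. obtained via @azure/identity

const url =
  `https://management.azure.com/subscriptions/${subscriptionId}` +
  `/resourceGroups/${resourceGroup}/providers/Microsoft.Logic` +
  `/workflows/${workflowName}/triggers/${triggerName}/histories` +
  `?api-version=2016-06-01`;

// Hypothetical persistence helper: insert one history record into your table.
async function saveToTable(record: unknown): Promise<void> {
  /* e.g. an INSERT via your database client of choice */
}

export async function syncTriggerHistories(): Promise<void> {
  const res = await fetch(url, { headers: { Authorization: `Bearer ${token}` } });
  if (!res.ok) throw new Error(`Listing trigger histories failed: ${res.status}`);
  const body = await res.json();
  // Each item includes properties such as startTime and status; map them
  // to your table's columns inside saveToTable().
  for (const history of body.value ?? []) {
    await saveToTable(history);
  }
}
```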

Related

What is the most suitable way to save a user's actions in a React webpage (SPA)?

I'm building a SPA site in React (using redux).
Any user can connect to my site through Google or Facebook.
Each user who logs in to the site receives a personal user_id.
For each user, the system needs to keep a history of the documents created by that user (like the recent docs in Word).
I need to create functionality so that whenever the user is logged in, he can see a history of the five documents he has created or updated.
In addition, the latest documents should load even after disconnecting and reconnecting to the system.
To load the history into the system, I am thinking of using a dedicated index in Elasticsearch.
My question is which approach would be most suitable when the user is already logged in and creates several documents one after the other:
Should I save everything to the index in ES, or is there a smart way to save and update the information locally without producing a lot of calls to the DB?
In the end, I want only two DB calls in total: one call to load the information on login and one call to update the information when the user logs out. Any other document creates and updates would be saved locally on the client side until the user leaves the site.
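One way to realize that two-call pattern is a small client-side module like the sketch below. It assumes hypothetical /api/history endpoints in front of your ES index, and a plain in-memory list standing in for whatever redux slice you would actually use:

```typescript
// Sketch of the "two DB calls total" approach: one read at login,
// local updates in memory, one write when the user leaves.
type DocRef = { id: string; title: string; updatedAt: number };

let recentDocs: DocRef[] = [];

// One read at login: load the stored history for this user.
export async function loadHistory(userId: string): Promise<void> {
  const res = await fetch(`/api/history/${userId}`); // hypothetical endpoint
  recentDocs = await res.json();
}

// Local update on every create/update: newest first, deduplicated, capped at five.
export function recordDoc(doc: DocRef): void {
  recentDocs = [doc, ...recentDocs.filter((d) => d.id !== doc.id)].slice(0, 5);
}

// One write when the user leaves: sendBeacon survives page unload
// more reliably than a plain fetch fired from an unload handler.
export function flushHistory(userId: string): void {
  navigator.sendBeacon(`/api/history/${userId}`, JSON.stringify(recentDocs));
}
```

One caveat with flushing only on logout: if the tab crashes or the user never logs out cleanly, the local history is lost, so wiring flushHistory to beforeunload or an occasional background flush is a safer middle ground.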

How do I filter and view daily logs in Google App Engine?

I have an express API application running on GAE. It is my understanding that every time someone makes a GET request to it, it creates a log. In the Logs Explorer (Operations -> Logging -> Logs Explorer), I can filter to view only GET requests from a certain source by querying:
protoPayload.method="GET"
protoPayload.referrer="https://fakewebsite.com/"
In the top-right of the Logs Explorer, I can also select a time range of 1 day to view logs from the past 24 hours.
I want to be able to see how many GET requests the app receives from a given referrer every day. Is this possible? And is there functionality to display the daily logs in a bar chart (say, to easily visualize how many logs I get every day over the period of a week)?
You can achieve this, but not directly in the Cloud Logging service. You have started well: you have created a custom filter.
Next, create a sink and save the logs to BigQuery.
Now you can run a query that counts the GET requests per day, and you can build a Data Studio dashboard to visualize your logs.
If only the daily count is needed, you can create a sink that streams the data directly into BigQuery. Since the data needs to be segregated by day, a better option when creating the sink is to use a partitioned table, which helps in two ways:
1. You get a new partition for each day.
2. Although BigQuery provides a free tier, if this data is not needed in the near future, storing it this way reduces both your storage cost and your querying cost.
BigQuery integrates with Data Studio: as soon as you query a table, you'll have the option to explore the result in Data Studio and generate reports as needed.
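As a sketch of what the daily count could look like once the sink is flowing, here's a query via the BigQuery Node client. The project, dataset, and table names are placeholders, and the exact column names depend on the schema your sink exports, so treat the field paths as assumptions to adjust:

```typescript
import { BigQuery } from "@google-cloud/bigquery";

const bigquery = new BigQuery();

// Count GET requests per day for one referrer from the exported log table.
// The table and column names below are assumptions; match them to your sink.
export async function dailyGetCounts(): Promise<void> {
  const query = `
    SELECT DATE(timestamp) AS day, COUNT(*) AS get_requests
    FROM \`my-project.my_logs_dataset.my_request_log_table\`
    WHERE protoPayload.method = 'GET'
      AND protoPayload.referrer = 'https://fakewebsite.com/'
    GROUP BY day
    ORDER BY day
  `;
  const [rows] = await bigquery.query({ query });
  for (const row of rows) {
    console.log(`${row.day.value}: ${row.get_requests} GET requests`);
  }
}
```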

Azure AD Enterprise application role change doesn't trigger provisioning update for Zoom or DocuSign

I've set up Zoom and DocuSign with SSO and automatic provisioning in Azure AD Enterprise Applications. Just-in-time provisioning works as expected after ensuring roles are correctly mapped. Automatic provisioning, however, only appears to add users the first time it runs. If I add an application user, change a user's application role, or remove the user from the application, nothing happens on the next provisioning run. I would expect the user to be added, the user's permissions to be updated at Zoom or DocuSign, or the user to be disabled.
Documentation seems to show that updates and deletes should be handled through provisioning. What am I missing?
My second question is whether the timing of how often the provisioning job runs can be changed. Testing is time consuming when I have to wait 40 minutes between tests.
Updates and deletes are handled if they are configured to be. https://learn.microsoft.com/en-us/azure/active-directory/app-provisioning/configure-automatic-user-provisioning-portal#configuring-automatic-user-account-provisioning
In the screenshot there, you'll see the actions that provisioning can target: create, update, delete. As a test, make sure all of the target actions are selected and try to change a different attribute, say, add some characters to a name or something. That should trigger an update to the provider.
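If you want to script that test, here's a hedged sketch that nudges an attribute through Microsoft Graph; the token and user id are placeholders, and the displayName value is just an example change:

```typescript
// Sketch: change one user attribute so the next incremental provisioning
// cycle sees an updated user object. Requires a Graph token with
// User.ReadWrite.All; token and userId are placeholders.
const token = "<graph-access-token>";
const userId = "<user-object-id>";

export async function touchUser(): Promise<void> {
  const res = await fetch(`https://graph.microsoft.com/v1.0/users/${userId}`, {
    method: "PATCH",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    // Appending a marker to displayName is enough of a change to be picked up.
    body: JSON.stringify({ displayName: "Jane Doe (provisioning test)" }),
  });
  if (!res.ok) throw new Error(`Graph PATCH failed: ${res.status}`);
}
```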
I believe changes to the user object itself will trigger provisioning changes.
The issue here is likely that app roles are specific to applications; they are not user or group attributes, so nothing has actually changed on the user object, and no change is detected.
As per https://learn.microsoft.com/en-us/azure/active-directory/app-provisioning/how-provisioning-works#incremental-cycles,
an incremental cycle will "Query the source system for any users and groups that were updated since the last watermark was stored."
If I take that literally, then changing the app role isn't a change to a user or a group, so it won't trigger a delta sync change.
Deletes, however, should occur if you unassign the user from the application, as per https://learn.microsoft.com/en-us/azure/active-directory/app-provisioning/how-provisioning-works#de-provisioning
As for your second question: I don't believe you can change the interval for the incremental schedules.

Keeping a data table in session for current user

I have a C# .NET 4.5 website where users select certain data fields; the data is then generated and the user can download it.
There is a new feature management wants me to add that will allow users to select any field, even fields that do not go together. When the user submits this job, I have to split out the fields and generate however many jobs it takes to create them.
Without having to change my entire back-end process, I want to store the user's selections in a data table in memory, and when they submit, I can loop through the table and submit the jobs accordingly.
What would be the best way to keep a data table alive for the entire user session? Should I create it as a session variable? The user can come back and add to or remove from it at any time while on the site.
Thank you

Send Email When Match in Firebase Database

I'm trying to build a web app using AngularJS and Firebase that sends an email to two users that match on certain parameters. The users submit their information first and if there is a match with another person in the database, I want to send an email to both those people. For example, if two people, A and B both have the age of 25, I would like to send A and B an email with certain information. Is this possible using Firebase?
If you are NOT running a server
Yes
You could achieve it by using a service like Zapier.
You could create a Zap linking Firebase and an e-mail service like Mandrill.
A Zap combines triggers and actions — whenever the trigger event occurs, Zapier automatically completes the action for you!
When there is a match, update a special key in your Firebase database.
As Zapier is listening for updates to that particular key, it will react by sending your e-mails via Mandrill.
March 2017 UPDATE
New tools have since been added to Firebase for triggering database event handlers:
Database event triggers: https://firebase.google.com/docs/functions/database-events
Cloud Functions for Firebase: https://firebase.google.com/docs/functions/functions-and-firebase
Using Cloud Functions to send e-mails through SendGrid: https://cloud.google.com/functions/docs/tutorials/sendgrid
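Putting those pieces together, here's a hypothetical sketch of a Cloud Function (v1 API) that fires when your matching logic writes a /matches node and e-mails both users via SendGrid; the /matches shape and the sendgrid.key config name are assumptions:

```typescript
import * as functions from "firebase-functions";
import * as sgMail from "@sendgrid/mail";

// The "sendgrid.key" config name is an assumption; set it with
// `firebase functions:config:set sendgrid.key="..."`.
sgMail.setApiKey(functions.config().sendgrid.key);

// Assumed node shape written by your matching logic:
// /matches/{matchId} = { emailA, emailB, age }
export const notifyMatch = functions.database
  .ref("/matches/{matchId}")
  .onCreate(async (snapshot) => {
    const match = snapshot.val();
    const message = (to: string) => ({
      to,
      from: "noreply@example.com", // placeholder verified sender
      subject: "You have a match!",
      text: `You matched with someone who is also ${match.age} years old.`,
    });
    await sgMail.send([message(match.emailA), message(match.emailB)]);
  });
```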
If you are running a server
Your server can easily check the values in Firebase and send e-mails accordingly. In that case the answer is: yes, of course :)
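For the server-side route, here's a minimal sketch with the Firebase Admin SDK, assuming a /users node shaped like { uid: { email, age } }; the node layout and the mailer are placeholders:

```typescript
import * as admin from "firebase-admin";

admin.initializeApp(); // uses default credentials on your server

// Hypothetical mailer: swap in Mandrill, SendGrid, nodemailer, etc.
async function sendMatchEmail(to: string, age: number): Promise<void> {
  /* ... */
}

// Find all users with a given age and e-mail every member of the match.
// (For performance, add ".indexOn": "age" to your database rules.)
export async function matchByAge(age: number): Promise<void> {
  const snap = await admin
    .database()
    .ref("users")
    .orderByChild("age")
    .equalTo(age)
    .once("value");
  const users = Object.values(snap.val() ?? {}) as { email: string; age: number }[];
  if (users.length >= 2) {
    await Promise.all(users.map((u) => sendMatchEmail(u.email, age)));
  }
}
```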
