How to retrieve "Runs Completed" count of more than one Logic App - azure-logic-apps

I have a bunch of Azure Logic Apps in my architecture and want to figure out the cumulative sum of total runs completed across all of the Logic Apps. So far, I could only add a single Logic App to the chart, but I am wondering if there is a way to get a cumulative count in a single shot.
Any suggestions are much appreciated. Thanks.

For this requirement, we can use Azure Log Analytics workspace.
We need to create a Log Analytics workspace first. Please refer to this tutorial to create it.
Then enable "Log Analytics" when you create the logic app and choose the Log Analytics workspace which we created above.
After that, we can see the run logs (including the cumulative count of completed runs) in the Log Analytics workspace: go to the workspace and click "Workspace summary".
If you have already created many logic apps and do not want to create them again, you can follow the steps below:
Also create a "Log Analytics workspace", as in the first solution above.
Then install the Logic Apps Management solution in it.
After that, go to your logic app and set its logs to be sent to the Log Analytics workspace.
By the way, logs arrive in the Log Analytics workspace with a bit of a delay, so please wait a while for them to show up (in my test, I waited more than 25 minutes).
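Once the logs are flowing, you can also get one cumulative number across every logic app that reports to the workspace by querying it directly instead of reading the summary tiles. A rough TypeScript sketch using @azure/monitor-query, assuming the diagnostic logs land in the AzureDiagnostics table; the workspace ID is a placeholder:

import { DefaultAzureCredential } from "@azure/identity";
import { Durations, LogsQueryClient, LogsQueryResultStatus } from "@azure/monitor-query";

// Placeholder: the GUID of the Log Analytics workspace created above.
const workspaceId = "<your-workspace-id>";

// Count completed runs per logic app that sends diagnostics to this workspace.
const query = `
AzureDiagnostics
| where ResourceProvider == "MICROSOFT.LOGIC"
| where OperationName has "workflowRunCompleted"
| summarize RunsCompleted = count() by Resource`;

async function main(): Promise<void> {
  const client = new LogsQueryClient(new DefaultAzureCredential());
  const result = await client.queryWorkspace(workspaceId, query, {
    duration: Durations.sevenDays, // last 7 days; adjust as needed
  });

  if (result.status === LogsQueryResultStatus.Success) {
    let total = 0;
    for (const row of result.tables[0].rows) {
      const resource = String(row[0]);
      const runs = Number(row[1]);
      console.log(`${resource}: ${runs}`);
      total += runs;
    }
    console.log(`Cumulative runs completed: ${total}`);
  }
}

main().catch(console.error);

The summarize clause gives the per-app breakdown, and the loop simply adds the counts up into the single cumulative figure the question asks for.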

Related

getting information from specific runs of logic apps

We have logic apps running in azure.
We can query some details of past runs in Azure Log Analytics.
Log Analytics does not seem to contain any of the output from each task in the logic app, even though I can see this in the logic app run history.
Is there a way to query the data/payloads/output from each task in the logic app?
Yes, you can get logs of Logic Apps; I followed the process below:
Firstly, open your Log Analytics workspace.
In the General tab, open Workspace summary.
Then click Add.
Then select Logic Apps Management (Preview).
Then click Create.
Then fill in the details and click Create.
Then open the solution you just created.
Then I created a workflow in a logic app and opened its Diagnostic settings.
Then fill in the important details (send the logs to the Log Analytics workspace above).
Open your Log Analytics workspace again and, in the General tab, open Workspace summary.
You will then see the Logic Apps Management output.
When you click on the chart you will get the run information, and clicking a row of the Logic Apps table drills into the details of that run.
Alternatively, in the workspace summary in Log Analytics, click on Logs, and then you can use the KQL query below to get the logs; you can export the results using the Export option:
AzureDiagnostics
| where ResourceProvider == "MICROSOFT.LOGIC"
| where OperationName has "workflowRunCompleted"
You can also send an email of the payload (example SO-thread) to check your runs, or you can create an alert on each run of the logic app.
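If you would rather pull the query results programmatically than click the portal's Export button, here is a minimal TypeScript sketch along the same lines; the workspace ID and output file are placeholders, and the exact columns depend on your workspace schema:

import { writeFileSync } from "fs";
import { DefaultAzureCredential } from "@azure/identity";
import { Durations, LogsQueryClient, LogsQueryResultStatus } from "@azure/monitor-query";

const workspaceId = "<your-workspace-id>"; // placeholder

const query = `
AzureDiagnostics
| where ResourceProvider == "MICROSOFT.LOGIC"
| where OperationName has "workflowRunCompleted"`;

async function exportRuns(): Promise<void> {
  const client = new LogsQueryClient(new DefaultAzureCredential());
  const result = await client.queryWorkspace(workspaceId, query, {
    duration: Durations.twentyFourHours, // last 24 hours
  });

  if (result.status !== LogsQueryResultStatus.Success) {
    throw new Error("Query did not complete successfully");
  }

  // Re-shape each row into an object keyed by column name before saving.
  const table = result.tables[0];
  const rows = table.rows.map((row) => {
    const obj: Record<string, unknown> = {};
    table.columnDescriptors.forEach((col, i) => {
      obj[col.name ?? `col${i}`] = row[i];
    });
    return obj;
  });

  writeFileSync("logic-app-runs.json", JSON.stringify(rows, null, 2));
}

exportRuns().catch(console.error);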

the best offline database for mobile

I made a complete app using Firebase, Expo and React Native (with a login screen and other features). But it didn't work as I expected, because I need an app whose database works offline (I want the user to use the app offline, and then connect to the internet and "update" the data).
In other words, I'm looking for another way. I heard about Realm, but it doesn't work with Expo (from what I've researched). Can anyone guide me to some possibilities?
About the app I'm making: login>selection>release
The user logs in (offline; the first time can be online, but when entering the app again, the login must be OFFLINE). The next screen is 'selection' (all screens show information for the logged-in user), where the user writes some information that is saved in a database. After completing the 'selection', the next screen is 'release', in which the user adds information to the same data table (db) as before (from the 'selection' screen). The user will not have internet when entering the app (but he can access the internet later and update his data as soon as he connects).
Firebase works offline and syncs the data when the device is back online.
Now, about other databases that work with Expo: if you like working with SQL, there is something I built, expo-sqlite-wrapper; it is a good ORM and I have not found anything better.
There is also a JSON-based database, easy-db-react-native; the downside is that you can't save too much data with it.
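If you go the plain SQL route, the offline-first flow can be as simple as a local table with a "synced" flag that you flush when connectivity returns. A rough sketch using the classic expo-sqlite transaction API; the table layout and the pushToServer callback are invented for illustration:

import * as SQLite from "expo-sqlite";

// Hypothetical local database; every row carries a "synced" flag so we
// know what still has to be pushed when the device gets connectivity.
const db = SQLite.openDatabase("selections.db");

export function initDb(): void {
  db.transaction((tx) => {
    tx.executeSql(
      `CREATE TABLE IF NOT EXISTS selections (
         id INTEGER PRIMARY KEY AUTOINCREMENT,
         payload TEXT NOT NULL,
         synced INTEGER NOT NULL DEFAULT 0
       );`
    );
  });
}

// Called from the "selection" / "release" screens while offline.
export function saveSelection(payload: object): void {
  db.transaction((tx) => {
    tx.executeSql("INSERT INTO selections (payload, synced) VALUES (?, 0);", [
      JSON.stringify(payload),
    ]);
  });
}

// Called once the user is back online: push unsynced rows, then mark them.
export function syncPending(pushToServer: (rows: unknown[]) => Promise<void>): void {
  db.transaction((tx) => {
    tx.executeSql("SELECT * FROM selections WHERE synced = 0;", [], async (_tx, result) => {
      const rows = result.rows._array; // expo-sqlite exposes result rows as _array
      if (rows.length === 0) return;
      await pushToServer(rows);
      db.transaction((tx2) =>
        tx2.executeSql("UPDATE selections SET synced = 1 WHERE synced = 0;")
      );
    });
  });
}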

How do I filter and view daily logs in Google App Engine?

I have an express API application running on GAE. It is my understanding that every time someone makes a GET request to it, it creates a log. In the Logs Explorer (Operations -> Logging -> Logs Explorer), I can filter to view only GET requests from a certain source by querying:
protoPayload.method="GET"
protoPayload.referrer="https://fakewebsite.com/"
In the top-right of the Logs Explorer, I can also select a time range of 1 day to view logs from the past 24 hours.
I want to be able to see how many GET requests the app receives from a given referrer every day. Is this possible? And is there functionality to display the daily logs in a bar chart (say, to easily visualize how many logs I get every day over the period of a week)?
You can achieve this but not directly in the Cloud Logging service. You have started well: you have created a custom filter.
Then, create a sink and save the logs to BigQuery.
So now, you can perform a query to count the GET requests for each day, and you can build a Data Studio dashboard to visualise your logs.
If only the daily count is needed, you can create a sink to stream the data directly into BigQuery. Since the data needs to be segregated by day when creating the sink, a better option would be to use a partitioned table, which can help you in two ways:
You would have a new partition for each day.
Although BigQuery provides a free tier, this data is not needed in the near future, so storing it this way will reduce your storage and querying costs.
BigQuery integrates with Data Studio; as soon as you query a table you'll have the option to explore the result in Data Studio and generate reports as needed.
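Once the sink is in place, the daily count is a single query against the exported table. A rough TypeScript sketch using @google-cloud/bigquery; the project/dataset/table name and the protoPayload field paths are assumptions, so check them against the schema your sink actually produces:

import { BigQuery } from "@google-cloud/bigquery";

const bigquery = new BigQuery();

// Hypothetical exported request-log table; replace with the table your log sink writes to.
const table = "`my-project.gae_logs.appengine_googleapis_com_request_log_*`";

const query = `
  SELECT
    DATE(timestamp) AS day,
    COUNT(*) AS get_requests
  FROM ${table}
  WHERE protoPayload.method = 'GET'
    AND protoPayload.referrer = 'https://fakewebsite.com/'
  GROUP BY day
  ORDER BY day`;

async function main(): Promise<void> {
  const [rows] = await bigquery.query({ query });
  for (const row of rows) {
    // BigQuery DATE values come back as objects with a .value string.
    console.log(`${row.day.value}: ${row.get_requests}`);
  }
}

main().catch(console.error);

The same result set is what you would point a Data Studio (or bar chart) report at to visualise requests per day over a week.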

Can we trigger Azure Logic App on demand?

Can I trigger a Logic App on demand?
My requirement is, I am creating an application which will export data from CRM to Redis cache so I want to trigger my logic app on demand.
PS: I am new in Logic Apps
Yes, you could add a Schedule trigger to your logic app. You could refer to this doc; there is a detailed description of how to set up the recurrence.
Recurrence schedules support settings such as the interval, start time, and running on specific days.
If you still have other questions, please let me know.
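As a side note, if "on demand" also means firing the workflow immediately from your own application instead of waiting for the recurrence, the management SDK can run a trigger programmatically. A rough TypeScript sketch using @azure/arm-logic; the subscription, resource group, workflow and trigger names are placeholders:

import { DefaultAzureCredential } from "@azure/identity";
import { LogicManagementClient } from "@azure/arm-logic";

// All of these values are placeholders for your own environment.
const subscriptionId = "<subscription-id>";
const resourceGroup = "<resource-group>";
const workflowName = "<logic-app-name>";
const triggerName = "Recurrence"; // the trigger defined in the workflow

async function runNow(): Promise<void> {
  const client = new LogicManagementClient(new DefaultAzureCredential(), subscriptionId);
  // Requests one run of the workflow right away, independent of its schedule.
  await client.workflowTriggers.run(resourceGroup, workflowName, triggerName);
  console.log("Logic App run requested");
}

runNow().catch(console.error);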

Run code in GAE according to changes in Firebase

Since Parse is shutting down, we are currently using Firebase to support basic data storage and real-time messaging. However, in order to implement a key feature in our app, we need to run some code on a server. The following is what we are trying to accomplish:
We allow users to upload key words to Firebase, and we want to send them notifications if any new posts containing those key words are uploaded by other users. For example, userA wants to know if anyone posted information related to chemistry, so userA enters the key words "chemistry" and "science" in our app, which get stored in Firebase. userB then posts an article called "chemistry rocks!" that contains the key word "chemistry", and userA immediately receives a notification about this post.
We have a couple of solutions in mind, but we are not sure which way to go and how to properly implement these solutions.
1 - Build a server that listens to Firebase changes and also supports sending notifications to individual users. However, hosting and maintaining a server just to run a search algorithm is too much work for this simple task.
2 - Store the key words in another database that somehow can send notifications according to the search result. This would be faster because we wouldn't have to connect Firebase server to our own server, but again we would still have to host and maintain a separate server.
I have looked into Google App Engine, their push/pull queue feature sounds like something we want, but does GAE support notifications? And also how can we hook it up with Firebase? We also came across Firebase+Batch to send notifications, but I don't think Batch supports cloud computation.
Has anyone run into this problem? Any solutions?
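For what it's worth, option 1 needs very little code if the "server" is just the Firebase Admin SDK running on a small Node instance (App Engine or anywhere else). A very rough TypeScript sketch; the /posts and /keywords paths, the stored FCM token and the naive keyword match are all invented for illustration:

import * as admin from "firebase-admin";

admin.initializeApp({
  credential: admin.credential.applicationDefault(),
  databaseURL: "https://<your-project>.firebaseio.com", // placeholder
});

const db = admin.database();

// Listen for every new post and notify users whose keywords match.
// Note: child_added also fires for existing children on startup; in
// practice you would filter by a timestamp or a "processed" flag.
db.ref("/posts").on("child_added", async (snapshot) => {
  const post = snapshot.val() as { title?: string; body?: string };
  const text = `${post.title ?? ""} ${post.body ?? ""}`.toLowerCase();

  // Hypothetical layout: /keywords/<uid> = { words: string[], fcmToken: string }
  const keywordSnap = await db.ref("/keywords").once("value");
  const subscriptions = (keywordSnap.val() ?? {}) as Record<
    string,
    { words: string[]; fcmToken: string }
  >;

  for (const [uid, sub] of Object.entries(subscriptions)) {
    const hit = sub.words.some((w) => text.includes(w.toLowerCase()));
    if (!hit) continue;
    // Push the notification through Firebase Cloud Messaging.
    await admin.messaging().send({
      token: sub.fcmToken,
      notification: {
        title: "New post matches your keywords",
        body: post.title ?? "Open the app to see it",
      },
    });
    console.log(`Notified ${uid}`);
  }
});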
