Azure Logic App step failure monitoring with alerts - azure-logic-apps

We have a somewhat complex logic app, and it has a step in which we create a file and upload it; this step can fail sometimes.
What we would like to do is monitor this step over a 3-hour period, and if it fails more than 10 times in that window, send an alert to a team.
Unfortunately I have not been able to find a way to do this. One idea was that every failure of this step should be recorded in a Log Analytics workspace, and maybe we could query that workspace for such errors and then generate an alert from it.
Is anyone aware of how we can monitor a step in Azure Logic Apps for failures over a 3-hour period and then generate an alert? All of this has to be done with out-of-the-box Azure; we are not looking for any third-party paid solutions.
Thank you,

Is this what you are looking for: action groups?
Some of the possible rules, such as Actions Failed, could work with a greater-than count of errors per hour.
Otherwise, would it be possible to have the logic app call itself when the step fails, passing the file that failed so it can retry the operation? Something like a 'run after unsuccessful' action?
I guess you could also post to a Slack channel on an unsuccessful action.

Thanks, but we had already explored this option and it doesn't help for our requirements. We were looking for a monitoring solution, and we finally arrived at this query, with an alert built on top of it, to make it work for our requirement:
AzureDiagnostics
| where status_s == "Failed"
    and tags_displayName_s == "My Logic Apps Name"
    and resource_actionName_s in ("For_each_file_in_blob_copy_it_to_sftp_server", "Create_file_on_SFMC_SFTP_folder")
| summarize count() by resource_runId_s, resource_actionName_s, tags_displayName_s, status_s
Using the above query we created a new alert rule (the 3-hour evaluation window and the failure threshold are configured on the alert itself), and this currently covers our monitoring requirement.

Related

Send Dynamic data to Azure Logic app based on fixed schedule

I have a logic app, let's name it 'LA1', with an HTTP trigger. This logic app can accept multiple request types (see Request 1 and Request 2 below) and calls the respective nested logic apps based on the request fields:
Request 1 -
{
    "Format": "F1",
    "Time": "T1"
}
Request 2 -
{
    "Format": "F2",
    "Time": "T2"
}
Now I want the above requests to be sent to LA1 at specific time intervals, say Request 1 every 1 minute and Request 2 every 2 minutes. This was accomplished successfully using Scheduler Job Collections in the Azure portal, where I'd create a couple of schedulers to run every 1 or 2 minutes and configure Request 1 and Request 2 in them.
Now that Microsoft has retired Scheduler Job Collections, I would like to know the alternative options available to send dynamic data (scheduled at specific intervals) to the LA1 logic app.
I understand that creating multiple logic apps with a recurrence trigger and passing different JSON is one option; however, I would like to avoid it, as I would end up creating too many logic apps, and any change would require redeploying the logic apps to every environment.
I would like something that's configurable (a one-time configuration in every environment), which is something Scheduler Job Collections catered to perfectly. Any thoughts/ideas are much appreciated!
Thanks!
One choice is an Azure Function; it has a timer trigger binding that uses CRON expressions to define your schedule. For more information, refer to this doc: Timer trigger for Azure Functions.
Azure also provides Azure Automation, which supports scheduling a runbook. You could use PowerShell to manage schedules. For more details, check this doc: Scheduling a runbook in Azure Automation.
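As a rough sketch of the first option, assuming the Azure Functions Java worker, a timer-triggered function could post Request 1 to LA1 every minute. The LA1_URL app setting (holding the Logic App's HTTP trigger callback URL) and the function name are made-up placeholders:

import com.microsoft.azure.functions.ExecutionContext;
import com.microsoft.azure.functions.annotation.FunctionName;
import com.microsoft.azure.functions.annotation.TimerTrigger;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SendRequest1 {
    // Fires every minute (CRON fields: second minute hour day month day-of-week)
    // and posts Request 1 to the LA1 HTTP trigger.
    @FunctionName("SendRequest1")
    public void run(
            @TimerTrigger(name = "timer", schedule = "0 */1 * * * *") String timerInfo,
            final ExecutionContext context) throws Exception {
        String callbackUrl = System.getenv("LA1_URL"); // placeholder app setting
        String body = "{\"Format\":\"F1\",\"Time\":\"T1\"}";
        HttpRequest request = HttpRequest.newBuilder(URI.create(callbackUrl))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        context.getLogger().info("LA1 responded with status " + response.statusCode());
    }
}

A second function with schedule "0 */2 * * * *" would cover Request 2, and the payloads could also be read from configuration so that changing them does not require a redeployment.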

Load testing a Google App Engine Application using JMeter

I've created an application and I'd like to test how well it scales to large numbers of users.
To run my application a user has to go to the homepage, sign in to a Google account, click a button and then upload a video file.
First of all, is this possible to emulate using JMeter? I'm signed into my Google account locally, but I'm not sure whether simulated users will have access to it.
Secondly, I've recorded a session in JMeter doing the actions above and have run the test with 10 simulated users, however, the App Engine dashboard doesn't detect any activity. I've followed the steps mentioned here but obviously with details of my application etc.
Here's a screenshot of the summary report.
Is there anything obvious I might be doing wrong? Am I using JMeter in the correct way to test the application as desired?
Apologies for my JMeter inexperience.
This is not something you will be able to simply record and replay; my expectation is that your application is protected by OAuth, so you will need a token in order to execute your calls.
Not knowing the details of your application's implementation, it's quite hard to guess what went wrong. I would recommend:
Running your test with 1 user and 1 loop first to ensure it's doing what it is supposed to do, by adding a View Results Tree listener and inspecting the request and response details for each sampler (especially the failed ones).
Once you figure out what's wrong with a particular request, amend the JMeter configuration so it succeeds. Repeat until you're happy with the test end-to-end.
Add load only after that, and be careful, as the test might be sensitive to extra users/loops, especially if you're using a single login account (which is not recommended).
References:
How to Handle Correlation in JMeter
How to Run Performance Tests on OAuth Secured Apps with JMeter

Getting server logout when running long batch jobs in Seam

One of the requirements I have is to generate flat files in a specific format. The user selects the year from the UI and clicks the generate button.
The flat-file process usually takes 3 to 4 hours to generate all the files. While the process is running and the flat files are being created, the UI shows a modal indicating that the job is being processed.
The problem is that after the files are successfully generated, the UI redirects to the login screen. Instead, I want to refresh the UI with a message that the process has completed successfully.
I am looking for help on this. Also, would increasing the conversation timeout or session timeout in web.xml help fix this issue?
Yes, you could increase both the session timeout and the conversation timeout (if you're doing work in conversation scope) so they exceed the duration of the job.
A better solution may be to store information about the jobs in a higher scope (e.g. application scope, or in the database); then, even if the user is accidentally logged out, the job will continue running and complete.
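As a minimal sketch of that idea, assuming Seam 2.x, the generation could run in an @Asynchronous method that records its status in application scope (the component name, status key, and status values below are made up). The page behind the modal can then poll that status instead of holding a conversation open for hours:

import org.jboss.seam.annotations.Name;
import org.jboss.seam.annotations.async.Asynchronous;
import org.jboss.seam.contexts.Contexts;

@Name("flatFileJob")
public class FlatFileJob {

    // Runs on Seam's asynchronous dispatcher, outside the caller's request,
    // so the session/conversation can expire without killing the job.
    @Asynchronous
    public void generateFiles(int year) {
        Contexts.getApplicationContext().set("flatFileJobStatus", "RUNNING");
        try {
            // ... generate the flat files for the selected year ...
            Contexts.getApplicationContext().set("flatFileJobStatus", "COMPLETED");
        } catch (Exception e) {
            Contexts.getApplicationContext().set("flatFileJobStatus", "FAILED");
        }
    }
}

Writing the status to a database table instead of application scope works the same way and also survives a server restart.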

Best approach for real time process information / Server + JS Client

I have a C# Web API project on server side and on front-end I have ExtJS 4.2.1 (Javascript framework client).
There is a section in my app where I request to start a long running process (about 5 minutes) and I want to show the user the status of the process being executed.
Basically, the process will run a special calculation for every employee in the database (about 800), so I want to let the user know which Employee is being processed in that moment.
So I was thinking of two ways of doing this, and I don't know whether having both is OK.
Use SignalR to show the information about the process in real time.
Write the whole process log to a database table (every employee that is being processed).
With the first approach, if the user closes the browser he will lose all the information about the process, and if he logs into the app again he will only see the current status.
With the second approach, if he logs into the app again he can see all the information, and using a timer on the client side the data could be refreshed every 5 seconds.
Does anyone have implemented something like this? Any advice is appreciated.
You should use a combination of the two. When you have calculated an employee, save the state to the database and publish the change on a service bus.
Let SignalR pick these messages up and forward them to the client. This way the user will see the old state when he connects and new states as they arrive over SignalR. I have created an event aggregator proxy that makes this very easy.
https://github.com/AndersMalmgren/SignalR.EventAggregatorProxy/wiki
Follow the wiki to set it up; here is a demo project:
https://github.com/AndersMalmgren/SignalR.EventAggregatorProxy/tree/master/SignalR.EventAggregatorProxy.Demo.MVC4
Live demo
http://malmgrens.org/Signalr/
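The stack in the question is C# Web API with an ExtJS client, but just to illustrate the shape of the combined approach in a self-contained way, here is a stack-agnostic sketch in Java in which every per-employee update is both written to a durable store (standing in for the database table) and pushed to live subscribers (standing in for SignalR clients); all names are made up:

import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.function.Consumer;

// "Persist + push": a reconnecting client replays the stored history,
// and connected clients receive new updates as they happen.
public class ProgressReporter {
    private final Map<String, String> progressTable = new ConcurrentHashMap<>();
    private final List<Consumer<String>> liveSubscribers = new CopyOnWriteArrayList<>();

    public void subscribe(Consumer<String> client) {
        // Catch up from the durable record first...
        progressTable.forEach((employee, status) -> client.accept(employee + ": " + status));
        // ...then keep receiving live updates.
        liveSubscribers.add(client);
    }

    public void employeeProcessed(String employeeId) {
        progressTable.put(employeeId, "processed");                          // durable record
        liveSubscribers.forEach(c -> c.accept(employeeId + ": processed"));  // live push
    }
}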

Google App Engine - Cron or Task Queue?

I'm building a simple "play against a random opponent" back-end using Google App Engine. So far I'm adding each user that wants to play into a "table" in the Datastore. As soon as there is more than one player in the Datastore, I can start to match them.
Scheduled Tasks with Cron looked promising for this until I saw that the lowest resolution seems to be one minute. If plenty of players are signing up, I want them to be matched quickly and not have to wait a whole minute (worst case).
I thought about having the servlet that receives the "play against random opponent" request POST to a Task Queue that would do the matchmaking, but I think this will lead to a lot of contention when reading from the Datastore and deleting the entities from the "random" table after they have been matched?
Basically I want a single worker that will do the matching, and I want to signal this worker from time to time that now is a good time to try to match opponents.
Any suggestions on what would be the right course of action here?
You can guarantee exclusive access via transactions:
Receive a request to play via REST. Check (within a transaction) whether there is any pending request in the database.
If there is, notify both users to start playing and delete the request (transactionally) from the database.
If there isn't, add it to the database and wait for the next request.
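As a rough sketch of that flow, assuming the App Engine Java Datastore API and a single parent key so every waiting player lives in one entity group (transactional queries must be ancestor queries); the kind and property names are made up:

import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.datastore.FetchOptions;
import com.google.appengine.api.datastore.Key;
import com.google.appengine.api.datastore.KeyFactory;
import com.google.appengine.api.datastore.Query;
import com.google.appengine.api.datastore.Transaction;
import java.util.List;

public class MatchMaker {
    private final DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();
    // One parent key so all waiting players share a single entity group.
    private final Key lobbyKey = KeyFactory.createKey("Lobby", "default");

    public void requestMatch(String playerId) {
        Transaction txn = datastore.beginTransaction();
        try {
            // Is anyone already waiting?
            Query query = new Query("WaitingPlayer").setAncestor(lobbyKey);
            List<Entity> waiting = datastore.prepare(txn, query)
                    .asList(FetchOptions.Builder.withLimit(1));
            if (!waiting.isEmpty()) {
                // Match found: remove the waiting entry and notify both players.
                Entity opponent = waiting.get(0);
                datastore.delete(txn, opponent.getKey());
                // ... notify playerId and opponent.getProperty("playerId") to start ...
            } else {
                // Nobody waiting: record this player and wait for the next request.
                Entity entry = new Entity("WaitingPlayer", lobbyKey);
                entry.setProperty("playerId", playerId);
                datastore.put(txn, entry);
            }
            txn.commit();
        } finally {
            if (txn.isActive()) {
                txn.rollback();
            }
        }
    }
}

Keep in mind that a single entity group supports roughly one write per second, so under heavy sign-up load you will see transaction retries; that is the contention trade-off the pull-queue variant below sidesteps.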
Update:
Alternatively, you can achieve what you need via a pull queue. Same scenario as above, but instead of the Datastore you'd check whether there is a task in the pull queue, lease it if there is, or create a new one if there isn't.
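A corresponding sketch of the pull-queue variant, assuming the Task Queue Java API and a queue named "matchmaking" declared with pull mode in queue.xml (the queue name is made up):

import com.google.appengine.api.taskqueue.Queue;
import com.google.appengine.api.taskqueue.QueueFactory;
import com.google.appengine.api.taskqueue.TaskHandle;
import com.google.appengine.api.taskqueue.TaskOptions;
import java.util.List;
import java.util.concurrent.TimeUnit;

public class PullQueueMatchMaker {
    // "matchmaking" must be defined as a pull queue in queue.xml.
    private final Queue queue = QueueFactory.getQueue("matchmaking");

    public void requestMatch(String playerId) {
        // Try to lease one waiting player for 30 seconds.
        List<TaskHandle> waiting = queue.leaseTasks(30, TimeUnit.SECONDS, 1);
        if (!waiting.isEmpty()) {
            TaskHandle opponent = waiting.get(0);
            String opponentId = new String(opponent.getPayload());
            // ... notify playerId and opponentId to start the game ...
            queue.deleteTask(opponent);
        } else {
            // Nobody waiting: park this player in the queue.
            queue.add(TaskOptions.Builder.withMethod(TaskOptions.Method.PULL)
                    .payload(playerId.getBytes()));
        }
    }
}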
