Is there a way to make a logic app run on a specific date in the future, with no repeat? I tried the Recurrence trigger, but it requires an interval.
Thank you all.
The following Microsoft documentation covers running a logic app only once in the future, without repeating the run:
https://learn.microsoft.com/en-us/azure/logic-apps/concepts-schedule-automated-recurring-tasks-workflows#run-one-time-only
It uses the Delay Until action in Azure Logic Apps: the workflow's trigger fires once, the Delay Until action pauses the run until the target date and time, and the remaining actions execute after the delay expires.
https://learn.microsoft.com/en-us/azure/connectors/connectors-native-delay#add-the-delay-until-action
I am currently creating a React Native + Expo application in which essentially every page makes an API call, which adds up to a lot of API calls. The app is also connected to Firebase for various data. The thing is, most of these pages don't update more than once or twice a day, so I really don't want end users calling the API that often either.
My question is: is there a way to write and host a script that runs continuously, calls this API once every hour (or so), and rewrites the Firebase DB, so that the app only ever needs to pull from the database instead of each user individually making dozens of API calls?
Please let me know! I have spent days on Google and am no closer than when I started. I'm also willing to change my setup away from Firebase if this isn't possible with it. Thanks!
You can use a Cloud Functions scheduled trigger to run code periodically that can make changes to your database.
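For example, here is a minimal sketch of that approach using Cloud Functions for Firebase (the 1st-gen TypeScript API) with the Realtime Database. The upstream API URL and the pages/latest path are placeholders, and it assumes a Node 18+ runtime for the global fetch:

// Minimal sketch, assuming Cloud Functions for Firebase (1st gen) and the
// Realtime Database. The upstream URL and database path are placeholders.
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

admin.initializeApp();

// Cloud Scheduler runs this once an hour on your behalf.
export const refreshPageData = functions.pubsub
  .schedule("every 60 minutes")
  .onRun(async () => {
    // One fetch per hour, no matter how many users are online.
    const response = await fetch("https://example.com/api/page-data"); // placeholder
    const data = await response.json();

    // Cache the result; clients read this path instead of calling the API.
    await admin.database().ref("pages/latest").set({
      data,
      updatedAt: admin.database.ServerValue.TIMESTAMP,
    });
  });

Your React Native screens then subscribe to pages/latest, so the third-party API sees at most 24 calls a day regardless of how many users you have.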
We have a somewhat complex logic app, and it has a step in which we create a file and upload it; this step can sometimes fail.
What we would like to do is monitor this step over a 3-hour period, and if it fails more than 10 times within that window, send an alert to a team.
Unfortunately, I have not been able to find a way to do this. One thought was that every failure of this step should be recorded in a Log Analytics workspace, and maybe we can query that workspace for such errors and then generate an alert from it.
Is anyone aware of how we can monitor a single step in Azure Logic Apps for failures over a 3-hour period and then generate an alert? All of this has to be done with out-of-the-box Azure; we are not looking for any third-party paid solutions.
Thank you,
Is this what you are looking for: action groups?
Some of the possible rules, such as Actions Failed, could work on a greater-than count of errors per hour.
Otherwise, would it be possible for the logic app to call itself when the step fails, passing the file that failed so it can retry the operation? Something like a 'run after unsuccessful' action?
I guess you could also post to a Slack channel on an unsuccessful action.
Thanks, but we had already explored this option and it doesn't meet our requirements. We were looking for a monitoring solution, and we finally got this query, and an alert built on it, to satisfy our requirement:
AzureDiagnostics
| where
    status_s == "Failed" and
    tags_displayName_s == "My Logic Apps Name" and
    resource_actionName_s in ("For_each_file_in_blob_copy_it_to_sftp_server", "Create_file_on_SFMC_SFTP_folder")
| summarize count() by resource_runId_s, resource_actionName_s, tags_displayName_s, status_s
Using the above query we created a new alert, with the evaluation window and threshold set to match our requirement (more than 10 failures over 3 hours), and this currently covers our monitoring needs.
I'm trying to call a SuccessFactors API to update some data from a Logic App.
But I keep running into an "Unauthorized" error.
How can I get more details about this error? I can't see the inputs and outputs for this action, so it's a bit difficult to debug.
Kind Regards
Tim
I ended up trying to mimic the call in an online REST test tool. That gave me the error I was looking for.
SuccessFactors has user-level settings that only allow logins from certain IPs. Once I added the Logic App's IPs, it worked.
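If you'd rather reproduce the call from code than from an online tool, a minimal sketch like this (TypeScript on Node 18+, with a placeholder endpoint, entity, and credentials) surfaces the full error body that the Logic App action hides:

// Replay the failing request outside the Logic App to see the raw response.
// The URL, user, and password below are placeholders, not real values.
const url = "https://api.example.successfactors.com/odata/v2/User('123')"; // placeholder
const user = "apiuser@companyid"; // placeholder
const password = "secret"; // placeholder

async function replay(): Promise<void> {
  const res = await fetch(url, {
    headers: {
      Authorization: "Basic " + Buffer.from(`${user}:${password}`).toString("base64"),
      Accept: "application/json",
    },
  });
  // On a 401, the response body usually names the actual cause,
  // e.g. an IP restriction on the API user.
  console.log(res.status, res.statusText);
  console.log(await res.text());
}

replay().catch(console.error);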
I've created an application and I'd like to test how well it scales to large numbers of users.
To run my application a user has to go to the homepage, sign in to a Google account, click a button and then upload a video file.
First of all, is this possible to emulate using JMeter? I'm signed in to my Google account locally, but I'm not sure whether the simulated users will have access to it.
Secondly, I've recorded a session in JMeter performing the actions above and have run the test with 10 simulated users; however, the App Engine dashboard doesn't detect any activity. I've followed the steps mentioned here, but obviously with the details of my own application.
Here's a screenshot of the summary report.
Is there anything obvious I might be doing wrong? Am I using JMeter in the correct way to test the application as desired?
Apologies for my JMeter inexperience.
This is not something you will be able to simply record and replay. My expectation is that your application is protected by OAuth, so you will need a token in order to execute your calls.
Not knowing the details of your application's implementation, it's quite hard to guess what went wrong. I would recommend:
Running your test with 1 user and 1 loop first to ensure it's doing what it is supposed to do, by adding a View Results Tree listener and inspecting the request and response details for each sampler (especially the failed ones).
Once you figure out what's wrong with a particular request, amend the JMeter configuration so that it succeeds. Repeat until you're happy with the test end to end.
Add load only after that, and be careful, as the test might be sensitive to extra users/loops, especially if you're using a single login account (which is not recommended).
References:
How to Handle Correlation in JMeter
How to Run Performance Tests on OAuth Secured Apps with JMeter
Hey guys, kind of a n00b in App Engine here, and I have been struggling with this: is there a way to add/bulk-load default data into the Datastore?
I would like to create catalogs or example data, as well as users and permissions. I am not using the default App Engine user; instead I am using the webapp2 session-based User auth model.
Thanks
You can use the bulkloader: https://developers.google.com/appengine/docs/python/tools/uploadingdata
Or upload data to the blobstore and move it to the datastore.
This is a large topic, but I am using Java code running in task queues to do this.
Much easier to create random test and demo data through code.
Much more friendly to unit testing.
This requires no dependencies. It is just code running and accessing the datastore.
Sometimes easier to manipulate the datastore through code instead of scripts when logic is involved in the changes.
It allows us to upload new task definitions (Java classes) embedded in a new app version. Then we trigger the task executions by calling a servlet URL. These task classes are then removed from the next app version.
And using tasks, you get around the request execution timeout. If a task is long-running, we split it into sequential tasks: when a task completes, it queues the next one automatically (the pattern is sketched below).
Of course, this requires a fair amount of coding but is really simple and flexible at the same time.
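The answer above uses Java on classic App Engine task queues, but the self-chaining pattern it describes is stack-agnostic. As a rough illustration only, here is a minimal sketch of the same idea in TypeScript with the Cloud Tasks client; the project, location, queue, handler path, and seeding logic are all placeholders:

// Sketch of the "task queues the next task" pattern from the answer above,
// here with Google Cloud Tasks instead of classic App Engine task queues.
// Project, location, queue, and the seeding logic are all placeholders.
import { CloudTasksClient } from "@google-cloud/tasks";

const client = new CloudTasksClient();
const parent = client.queuePath("my-project", "us-central1", "seed-queue"); // placeholders

// Called from a request handler behind e.g. POST /tasks/seed:
// seed one batch of demo data, then enqueue the next batch.
export async function seedBatch(offset: number, batchSize: number): Promise<void> {
  const moreToDo = await writeDemoEntities(offset, batchSize); // placeholder helper

  // Chaining around the request timeout: each task schedules its successor,
  // so no single request has to cover the whole data set.
  if (moreToDo) {
    await client.createTask({
      parent,
      task: {
        appEngineHttpRequest: {
          httpMethod: "POST",
          relativeUri: "/tasks/seed",
          body: Buffer.from(
            JSON.stringify({ offset: offset + batchSize, batchSize })
          ).toString("base64"),
        },
      },
    });
  }
}

// Placeholder: write one batch of demo entities to the datastore and report
// whether there is more work left.
async function writeDemoEntities(offset: number, batchSize: number): Promise<boolean> {
  // ... datastore writes would go here ...
  return offset + batchSize < 1000; // placeholder stop condition
}

Each request stays short because it only seeds one batch; the chain of tasks, not any single request, covers the whole data set.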