We are maintaining the definition of our Logic App in an ARM template. The app has multiple email triggers. Currently, each trigger is hard-coded in the ARM template. The solution is growing and we'd like to keep adding triggers, ideally without updating the ARM template.
Is it possible to create triggers dynamically at deployment time, based on a list provided as a parameter? E.g. email_trigger_1 for mailbox test1@test.com, email_trigger_2 for mailbox test2@test.com, etc.
I'm looking for something similar to ARM copy, which doesn't work in this case.
You can try using a PowerShell automation script. I assume you already have your ARM template ready (for reference follow this).
Create a PowerShell script and put the deployment command in it: https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-deploy-azure-resource-manager-templates#deploy-with-azure-powershell
Capture the status of the deployment. If it returns success, update the deployment (ARM) template using PowerShell commands and save the file: How do I update a JSON file using PowerShell
https://intellipaat.com/community/10659/update-json-file-using-powershell
While updating the template, you can increment the counters for the required trigger names and other properties for your next deployment.
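A minimal sketch of that flow, assuming the Az PowerShell module; the resource group name, file names, and the trigger-counter edit are placeholders rather than anything from the original post:

```powershell
$templateFile = '.\logicapp-template.json'

# 1. Deploy the current ARM template
$deployment = New-AzResourceGroupDeployment `
    -ResourceGroupName 'my-rg' `
    -TemplateFile $templateFile `
    -TemplateParameterFile '.\logicapp-parameters.json'

# 2. On success, edit the template JSON so the next run adds another trigger
if ($deployment.ProvisioningState -eq 'Succeeded') {
    $template = Get-Content $templateFile -Raw | ConvertFrom-Json
    # ... add/rename email_trigger_<n> here for the next mailbox ...
    $template | ConvertTo-Json -Depth 50 | Set-Content $templateFile
}
```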
One of my Logic Apps is triggered by the SFTP trigger. Today I changed the watch folder to a different folder, and I can't get it to trigger any more. I had this issue previously and had to create a new SFTP connector. This time I was able to resolve it by creating a new Logic App with all the same logic as the original. This is simply not acceptable. It's as if, in programming, your only recourse when your program stops working were to write it from scratch. How is one supposed to diagnose a trigger not triggering? I don't want to be faced with throwing away all my work again in the future.
Did you change the path using design view or code view?
When using the Code View, you need to make sure you are also updating the folderId and the metadata properties.
folderId is a Base64 encoding of the path. The same goes for the first property of the metadata object.
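For example, you could compute the Base64 value for the new path before pasting it into the Code View (the path below is only an illustration):

```powershell
# Example only: Base64-encode the new SFTP folder path for use as folderId
[Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes('/inbox/incoming'))
```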
When using the designer view, it seems that the trigger state (required to be able to recognise new items) is refreshed more reliably when you browse through the folders in the SFTP trigger box (as opposed to just changing the path string).
HTH
I have business logic coded in Groovy and stored in a database table. I would love to edit this code inside Intellij IDEA with all the code completion possibilities etc. provided by this great IDE.
I could copy the script from the database table into a temporary file, edit it, and store it back into the database. I could also write a plugin. But is there maybe already a way to do this with IDEA?
As far as I know, you can create an "External Tool" that calls a simple import/export script, which lets you automate the tasks you'd like.
Writing a plugin isn't a very simple task, and I think it would be overkill.
Also, you can use a wrapper inside your application so that, in a debug configuration, the business logic scripts are loaded directly from files. That also lets you auto-reload them using Groovy's integration features, e.g. GroovyScriptEngine.
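A rough sketch of that idea; the script directory, script name, and binding contents are made-up examples:

```groovy
import groovy.util.GroovyScriptEngine

// Load business-logic scripts straight from a directory; a script that changes
// on disk is recompiled the next time it is run.
def engine = new GroovyScriptEngine(['src/main/scripts'] as String[])

def binding = new Binding([input: 42])          // whatever context the script needs
def result = engine.run('BusinessRule.groovy', binding)
println result
```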
I'm developing a Salesforce package that depends on some prepopulated data to work correctly (i.e. a list of countries to populate a custom setting).
Is there a way to prepopulate these objects at installation/upgrade time? (e.g. uploading a csv with the data I need to insert into some custom objects).
Is there a way to run a custom script at installation/upgrade time? (e.g. have the script update information on new fields, or adapt existing data to a modified object structure).
Thanks in advance.
This is actually a new piece of functionality that is coming in the Summer '12 (API Version 25.0) release. There are two new interfaces to implement, InstallHandler and UninstallHandler, which can be set up to run on install and uninstall of a package, respectively. You could implement InstallHandler and populate the objects/custom settings in that class.
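A minimal sketch of such an install handler; the CountrySetting__c custom setting and its fields are hypothetical names, not part of the platform:

```apex
// Hypothetical post-install script; CountrySetting__c and its fields are placeholders.
global class PostInstallScript implements InstallHandler {
    global void onInstall(InstallContext context) {
        // previousVersion() is null on a fresh install, non-null on upgrade
        if (context.previousVersion() == null) {
            insert new List<CountrySetting__c>{
                new CountrySetting__c(Name = 'AR', Country_Name__c = 'Argentina'),
                new CountrySetting__c(Name = 'BR', Country_Name__c = 'Brazil')
            };
        }
    }
}
```

You would then reference this class as the package's post install script when uploading the package.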
An alternative is to use a custom settings value to know if the installation procedure was run. Then you can use your package's point of entry to check for it and do the procedure if the value indicates it needs to run. It's a little complicated if you don't have a single point of entry.
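A rough sketch of that check, assuming a hypothetical hierarchy custom setting Package_Config__c with a Setup_Complete__c checkbox:

```apex
// Sketch only: Package_Config__c (hierarchy custom setting) and Setup_Complete__c
// are hypothetical names.
Package_Config__c config = Package_Config__c.getOrgDefaults();
if (config == null || !config.Setup_Complete__c) {
    // run the one-time setup here, then record that it has been done
    if (config == null) {
        config = new Package_Config__c();
    }
    config.Setup_Complete__c = true;
    upsert config;
}
```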
This is related to WiX:
I have a situation in which I have to deploy a file into multiple directories whose values are fetched from the registry. These directories could number anywhere from 1 to many.
And I don't want to create too many Directory entries whose values would be determined at runtime.
Can we call a custom action in a loop that would detect the target directories and set up our target folder values?
I know we can do such copying inside a custom action. But I'm looking for a way to do this via WiX entries.
I was reading about the DuplicateFiles action but couldn't find a proper way to achieve my goal.
Thanks a lot
The WiX element CopyFile maps to the DuplicateFiles action. You can use AppSearch to set properties and then use CopyFile to duplicate a file to a directory. DuplicateFiles is smart enough to not do anything if the property is null.
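A rough sketch of that declarative approach in WiX v3 syntax; the registry key, property name, and file/component IDs are placeholders:

```xml
<!-- AppSearch: read a target directory out of the registry into a property -->
<Property Id="EXTRA_TARGET_DIR">
  <RegistrySearch Id="ExtraTargetDirSearch"
                  Root="HKLM"
                  Key="SOFTWARE\MyCompany\MyApp"
                  Name="PluginDir"
                  Type="directory" />
</Property>

<!-- Install the file once, then duplicate it to the directory found above.
     If EXTRA_TARGET_DIR ends up null, DuplicateFiles simply skips the copy. -->
<Component Id="MyFileComponent" Guid="*" Directory="INSTALLFOLDER">
  <File Id="MyFile" Source="MyFile.dll" KeyPath="yes">
    <CopyFile Id="CopyToExtraDir" DestinationProperty="EXTRA_TARGET_DIR" />
  </File>
</Component>
```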
If the number of copies is known when you create your installer, you can just do that. If you think it's going to be more dynamic at runtime, you can write a custom action that emits temporary rows to the DuplicateFile table; that way DuplicateFiles and RemoveDuplicateFiles still do the heavy lifting.
You can use the principles described at Dynamic Windows Installer UI.
I'd like to know your approach/experiences when it's time to initially populate the Grails DB that will hold your app data. Assuming you have CSVs with data, is it "safer" to create a script (with whatever tool fits you) that:
1. Generates the Bootstrap commands with the domain classes, runs them in a test or dev environment, and then uses the native DB commands to export the data to prod?
2. Creates the DB's insert script, assuming GORM's version = 0 and manually incrementing the soon-to-be auto-generated IDs?
My fear is that the second approach may lead to inconsistencies, since Hibernate will be responsible for ID generation, and there may be something else I'm missing.
Thanks in advance.
Take a look at this link. It allows you to run Groovy scripts in the normal Grails context, giving you access to all Grails features including GORM. I'm currently importing data from a legacy database and have found that writing a Groovy script that uses the Groovy SQL interface to pull out the data and then puts it into domain objects is the easiest approach. Once you have the data imported, you just use the commands specific to your database system to move that data to the production database.
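A minimal sketch of such an import script; the legacy connection details, table, and the Country domain class are assumptions for illustration:

```groovy
import groovy.sql.Sql

// Connect to the legacy database (connection details are placeholders)
def sql = Sql.newInstance(
        'jdbc:mysql://localhost/legacy', 'user', 'password', 'com.mysql.jdbc.Driver')

// Pull each row out and save it through GORM so ids/versions are handled for us
sql.eachRow('SELECT code, name FROM legacy_country') { row ->
    new Country(code: row.code, name: row.name).save(failOnError: true)
}

sql.close()
```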
Update:
Apparently the updated entry referenced from the blog post I link to no longer exists. I was able to get this working using the code at the following link, which is also referenced in the comments.
http://pastie.org/180868
Finally, it seems the simplest solution is to take into account that GORM, as of the current release (1.2), uses a single sequence for all auto-generated IDs. Keeping this in mind when creating whatever scripts you need (in the language of your preference) should suffice. I understand it's planned for the 1.3 release that every table gets its own sequence.