Does anyone have past experience with Gradle? I'm thinking of using it for continuous deployment... I'm considering either using my own scripts (Python) or Gradle.
Can anyone tell from experience which way is recommended? Note that I already use Maven and I don't intend to move away from it for my dependency management and project management.
thanks
We have implemented Gradle-based deployment and environment management in a big governmental project (100+ servers). We had to develop a custom set of plugins (which is actually a rather straightforward process in Gradle) to handle tasks like remote SSH command execution through a Groovy DSL, creation of application server domains/clusters (we are using WebLogic), and application/configuration deployment.
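As a rough illustration (not the actual plugins from that project; the task type, hosts, and paths below are all made up), a custom Gradle task for remote command execution can be sketched like this:

    import org.gradle.api.DefaultTask
    import org.gradle.api.tasks.TaskAction

    // Hypothetical sketch of a custom task type for running a command on a
    // remote host. A production plugin would more likely use an SSH library
    // such as JSch instead of shelling out to the local ssh client.
    class RemoteExec extends DefaultTask {
        String host           // e.g. 'appserver01.example.org' (made up)
        String user = 'deploy'
        String remoteCommand  // the command to run on the remote box

        @TaskAction
        void runCommand() {
            project.exec {
                commandLine 'ssh', "${user}@${host}", remoteCommand
            }
        }
    }

    // Wiring it up then reads like a small deployment DSL:
    task restartAppServer(type: RemoteExec) {
        host = 'appserver01.example.org'
        remoteCommand = '/opt/weblogic/bin/restart-domain.sh'
    }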
We are also thinking of integrating Gradle with Puppet for easier Linux administration.
If you are coming from the Java world, using Gradle (which is Groovy-based) will be rather simple for you, because you can reuse your Java/Ant/Maven/Groovy knowledge to write scripts. The ability to create DSLs in Groovy also lets you build interesting abstractions. Gradle has a very clean API that makes it easy to build dependencies between tasks, it integrates very well with the Maven infrastructure, and you can reuse all Ant tasks.
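To give a feel for it, here is a minimal, hypothetical build.gradle fragment (the dependency, names, and paths are made up) showing Maven repository reuse, an Ant task called through the built-in AntBuilder, and a small task dependency chain:

    // Hypothetical build.gradle fragment; names and paths are made up.
    apply plugin: 'war'

    repositories {
        mavenCentral()   // reuses your existing Maven repository infrastructure
    }

    dependencies {
        compile 'commons-io:commons-io:2.0.1'
    }

    // Any Ant task is available through the built-in AntBuilder:
    task stageArtifacts(dependsOn: war) {
        doLast {
            ant.copy(todir: "$buildDir/staging") {
                fileset(dir: "$buildDir/libs")
            }
        }
    }

    // Tasks chain into a clean dependency graph:
    task deployToTest(dependsOn: stageArtifacts) {
        doLast {
            println "Deploying ${war.archiveName} to the test environment..."
        }
    }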
Yes, Gradle-based deployment is possible with the gradle-ssh-plugin.
Here is an article with a good usage example.
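For example, a minimal deployment script with that plugin could look roughly like this (the hosts, paths, and plugin version are assumptions; check the plugin documentation for the exact DSL):

    // Hypothetical hosts and paths; the plugin version is an assumption,
    // so check the plugin documentation for the current DSL.
    plugins {
        id 'org.hidetake.ssh' version '2.10.1'
    }

    remotes {
        webServer {
            host = 'web01.example.org'
            user = 'deploy'
            identity = file("${System.properties['user.home']}/.ssh/id_rsa")
        }
    }

    task deploy {
        doLast {
            ssh.run {
                session(remotes.webServer) {
                    // upload the artifact, then restart the service
                    put from: 'build/libs/myapp.war', into: '/opt/tomcat/webapps/'
                    execute 'sudo systemctl restart tomcat'
                }
            }
        }
    }

Running 'gradle deploy' then pushes the artifact and bounces the service over SSH, with the remote inventory kept in the build script itself.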
What is the best way to dynamically provide configuration to a Vespa application?
It seems that the only method that is talked about is baking configuration values into the application package, but is there any way to provide configuration values outside of that? I.e., are there CLI tools to update individual configuration values at runtime?
Are there any recommendations or best practices for managing configuration across different environments (i.e. production vs. development)? At Oath/VMG, is configuration checked into source control or managed outside of that?
Typically all configuration changes are made by deploying an updated application package. As you suggest, this is usually done by a CI/CD setup which builds and deploys the application package from a git repository whenever that changes.
This way it is easy to ensure changes have been reviewed (before merge), track all changes that have been made and roll them back if necessary. It is also easy to verify that the same changes which have been deployed and tested (preferably by automated tests) in a development / test environment are the ones that are deployed to production - because the same application package is deployed through each of those environments in order.
It is however also possible to update files in a deployed application package and create a new session from this, which may be useful if your application package has some huge resources. See https://docs.vespa.ai/documentation/cloudconfig/deploy-rest-api-v2.html#use-case-modify
I have created a Web Content Management library for use in WebSphere Portal. At the moment I'm using import-wcm-data to import the library, then I need to add some additional properties to 2-3 files on the server under Resource Environment Providers and then restart particular services so those changes are detected.
Can anyone explain the benefits of using a PAA over writing a simple bash (or similar) script to automate this process?
I don't understand whether I get any advantages when using a PAA, or whether a PAA is even capable of updating properties files and restarting services.
I have been working intensively with PAA files and I must say that it is a very stable way of deploying an app requiring multiple deployment steps and components.
It does require some up-front setup, but it is well worth it in a multi-server environment.
You can do all the tasks that you can do in an Ant file, as well as use the wsadmin scripting interface. I only update resource environment settings and the like in WAS, and I do not touch any properties files, for the simple reason that all settings are stored in WAS.
In my experience, a PAA is not a good method if you're merely importing a content library.
I don't think I understand why you are doing the import manually and not syndicating, but even if there's a good reason not to syndicate, the PAA process was too involved and required too many precursor actions (deleting libraries, removing the PAA, deploying the PAA, and then activating the portlets) to be a viable option for something as simple as importing a WCM library.
Since activating the portlets I was importing with the PAA was an extra manual step, I don't believe it can restart applications either.
I'm working on a very simple web app, written in Go language.
I have a standalone version and am now porting it to GAE. It seems like very few changes are needed, mainly concerning the datastore API (in the standalone version I need just files).
I also need to include appengine packages and use init() instead of main().
Is there any simple way to merge both versions? As there is no preprocessor in Go, it seems like I must write a GAE-compatible API for the standalone version, use that mock module for the standalone build, and use the real API for the GAE version. But it sounds like overkill to me.
Another problem is that GAE might be using an older Go version (e.g. the recent Go release uses the new template package, but GAE uses the older one, and they are incompatible). So, is there any chance to handle such differences at build time or at runtime?
Thanks,
Serge
UPD: Now GAE uses the same Go version (r60) as the stable standalone compiler, so the abstraction layer can be really simple now.
In broad terms, use abstraction. Provide interfaces for persistence, and write two implementations for that, one based on the datastore, and one based on local files. Then, write a separate main/init module for each platform, which instantiates the appropriate persistence interface, and passes it to your main application to use.
My immediate answer would be (if you want to maintain both GAE and non-GAE versions) that you use a reliable VCS which is good at merging (probably git or hg), and maintain separate branches for each version. The GAE API fits in reasonably well with Go, so there shouldn't be too many changes.
As for the issue of different versions, you should probably maintain code in the GAE version and use gofix (which is unfortunately one-way) to make a release-compatible version. The only place where this is likely to cause trouble is if you use the template package, which is in the process of being deprecated; if necessary you could include the new template package in your GAE bundle.
If you end up with GAE code which you don't want to run on Google's servers, you can also look into AppScale.
When creating an auto updating feature for a .NET WinForms application, how does it update the DLLs and not affect the currently running application?
Since the application is running during the update process, won't there be a lock on the DLLs (because those DLLs will have to be overwritten during the update).
Usually you would download the new files into a separate area. Then shut down and restart, and at startup look for and use the new files if found. Always keep a last-known-working version on the side so that the user can revert to something that definitely works if the download causes problems.
ClickOnce is a good technology from Microsoft that does this for you and you can use it directly from Visual Studio 2008.
You'll have to shutdown your application and restart it, as other people have already commented.
I wrote open-source code to do just that transparently, including an external update application to do the actual cold update. See http://www.code972.com/blog/2010/08/nappupdate-application-auto-update-framework-for-dotnet/
The code is at http://github.com/synhershko/NAppUpdate (Licensed under the Apache 2.0 license)
I have a separate 'launcher' application that checks for updates via a web service. If there are updates, it downloads them and then executes my application, which is in a separate assembly.
The other alternatives are using things like ClickOnce, or downloading the files to a separate area and restarting the app, as someone else mentioned.
Be warned about ClickOnce, though - it's not as flexible as it sounds. And if you deploy to a system that requires elevating your program to a higher security level to run, you might run into problems if you don't have a certificate for your app installed. I found it very difficult to get straight answers on the Internet about things like certificate management when it comes to ClickOnce. If you have a complex app, you may want to just roll your own updater, which is what I ended up having to do.
If you publish via ClickOnce, all of that tends to be handled for you. It has its own pros and cons, but it is usually easier than trying to code it all yourself.
Both Wikipedia and 15seconds have decent info on using ClickOnce, how it works, etc.
As others have stated, ClickOnce isn't as flexible as rolling your own solution but it is a LOT less complicated. It has a small learning curve at first, but with pretty much everything bundled into Visual Studio and the use of Wizards, it usually doesn't take long to stumble onto a working solution.
As deployments get more complex (i.e. beyond just having prerequisites or application code that needs updating) and you need to do a lot of post-install or pre-install tasks, there are things like WiX which give you somewhat of a hybrid solution between Windows Installer and ClickOnce, with the cost of flexibility being a much steeper learning curve.
The only reason I try to avoid custom installers is that you end up spending way too much time trying to get it just right to handle a bunch of different "What If" scenarios...
These days Windows can do such updates automatically for you with AppInstaller if your app is packaged as an MSIX package.
It downloads the new version of the app in another folder inside ProgramFiles\WindowsApps, then when a user runs the app via the start menu, the system knows what folder it should use. The previous version gets deleted when not in use.
If you want to know how to package your app this way I collected my findings in this answer.
Our team develops distributed WinForms apps. We use ClickOnce for deployment and are very pleased with it.
However, we've found the pain point with ClickOnce is in creating the deployments. We have the standard dev/test/production environments and need to be able to create deployments for each of these that install and update separate from one another. Also, we want control over what assemblies get deployed. Just because an assembly was compiled doesn't mean we want it deployed.
The obvious first choice for creating deployments is Visual Studio. However, VS really doesn't address the issues stated. The next in line is the SDK tool, Mage. Mage works OK but creating deployments is rather tedious and we don't want every developer having our code signing certificate and password.
What we ended up doing was rolling our own deployment app that uses the command line version of Mage to create the ClickOnce manifest files.
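The shape of such a wrapper is simple. Here is a hypothetical Groovy sketch driving Mage from the command line (the paths, names, and environment layout are made up; the flags follow the Mage.exe documentation). Per-environment output folders keep dev/test/production deployments separate, and only the build machine needs the signing certificate:

    // Hypothetical sketch of a Mage wrapper; paths, names, and the
    // environment layout are made up. Flags follow the Mage.exe docs.
    def mage = { List<String> args ->
        def proc = (['mage'] + args).execute()
        proc.waitForProcessOutput(System.out, System.err)
        assert proc.exitValue() == 0, "mage failed: ${args}"
    }

    def env     = 'test'            // dev / test / production
    def outDir  = "deploy/${env}"   // separate output per environment
    def version = '1.2.3.4'

    // 1. Application manifest built only from the assemblies staged for shipping
    mage(['-New', 'Application',
          '-ToFile', "${outDir}/MyApp.exe.manifest",
          '-FromDirectory', "${outDir}/bin",
          '-Name', "MyApp (${env})", '-Version', version])

    // 2. Deployment manifest pointing at that application manifest
    mage(['-New', 'Deployment',
          '-ToFile', "${outDir}/MyApp.application",
          '-AppManifest', "${outDir}/MyApp.exe.manifest",
          '-Version', version])

    // 3. Sign both, so only the build machine ever sees the certificate
    ['MyApp.exe.manifest', 'MyApp.application'].each { f ->
        mage(['-Sign', "${outDir}/${f}",
              '-CertFile', 'codesign.pfx',
              '-Password', System.getenv('CERT_PWD')])
    }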
I'm satisfied with our current solution, but it seems like there would be an industry-wide, accepted approach to this problem. Is there?
I would look at using MSBuild. It has built-in tasks for handling ClickOnce deployments. I have included a reference below that will help you get started, if you want to go down this path. It is what I use, and I have found it to fit my needs. With a good build process using MSBuild, you should be able to squash the pains you have felt.
Here is a detailed post on how ClickOnce manifest generation works with MSBuild.
I've used nAnt to run the overall build strategy, but pass parameters into MSBuild to compile and create the deployment package.
Basically, nAnt calls into MSBuild for each environment you need to deploy to, and generates a separate deployment output for each. You end up with a folder and all ClickOnce files you need for every environment, which you can just copy out to the server.
This is how we handled multiple production environments as well -- we had separate instances of our application for the US, Canada, and Europe, so each build would end up creating nine deployments, three each for dev, qa, and prod.