Jenkins Declarative Pipeline - user variables in params

I have a Jenkinsfile. I need to pass parameters from the Build with Parameters plugin and also have variables defined within the script. I cannot get either to work. Is it a syntax issue?
#!/usr/bin/env groovy
pipeline {
    agent any
    stages {
        stage('Config API (dev)') {
            steps {
                script {
                    apiName = "config_API"
                    taskDefinitionFamily = "mis-core-dev-config"
                    taskDefinition = "mis-core-dev-config"
                    if (params.apiName.contains('Stop Task')) {
                        build(job: 'Stop ECS Task (utility)',
                            parameters: [
                                string(name: 'region', value: params.region),
                                string(name: 'cluster', value: params.cluster),
                                string(name: 'family', value: params.taskDefinitionFamily)
                            ])
                    }
                    else if (params."${apiName}".contains('Start Task')) {
                        build(job: 'Start ECS Task (utility)',
                            parameters: [
                                string(name: 'region', value: params."${region}"),
                                string(name: 'cluster', value: params."${cluster}"),
                                string(name: 'taskDefinition', value: params."${taskDefinition}"),
                                string(name: 'containerInstanceIds', value: params."${containerInstanceIdsToStartOn}")
                            ])
                    }
                    else if (params."${apiName}" == null || params."${apiName}" == "") {
                        echo "Did you forget to check a box?"
                    }
                }
            }
        }
    }
}
My Build with Parameters variables are set in the GUI as string parameters:
containerInstanceIdsToStartOn = "463b8b6f-9388-4fbd-8257-b056e28c0a43"
region = "eu-west-1"
cluster = "mis-core-dev"
Where am I going wrong?

Define your parameters in a parameters block:
pipeline {
    agent any
    parameters {
        string(defaultValue: 'us-west-2', description: 'Provide your region', name: 'REGION')
    }
    stages {
        stage('declarative') {
            steps {
                print params.REGION
                sh "echo ${params.REGION}"
            }
        }
        stage('scripted') {
            steps {
                script {
                    print params.REGION
                }
            }
        }
    }
}
Output:
[Pipeline] {
[Pipeline] stage
[Pipeline] { (declarative)
[Pipeline] echo
us-west-2
[Pipeline] sh
[test] Running shell script
+ echo us-west-2
us-west-2
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (scripted)
[Pipeline] script
[Pipeline] {
[Pipeline] echo
us-west-2
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
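For the original snippet, the likely mix-up is between script variables and build parameters: params.taskDefinitionFamily only exists if a parameter with that exact name is declared in the GUI, and params."${region}" interpolates a script variable named region (which is never defined) rather than reading the region parameter. Below is a minimal sketch of the same stage with the parameters declared up front - the choice parameter here is an assumption standing in for the checkboxes implied by the 'Stop Task'/'Start Task' strings, not the asker's actual GUI setup:
pipeline {
    agent any
    parameters {
        string(name: 'region', defaultValue: 'eu-west-1', description: 'AWS region')
        string(name: 'cluster', defaultValue: 'mis-core-dev', description: 'ECS cluster')
        string(name: 'containerInstanceIdsToStartOn', defaultValue: '', description: 'Container instance IDs')
        // Hypothetical parameter standing in for the GUI checkboxes
        choice(name: 'config_API', choices: ['', 'Stop Task', 'Start Task'], description: 'Config API action')
    }
    stages {
        stage('Config API (dev)') {
            steps {
                script {
                    // Script-local values: plain variables, not params.*
                    def taskDefinitionFamily = 'mis-core-dev-config'
                    def taskDefinition = 'mis-core-dev-config'
                    if (params.config_API == 'Stop Task') {
                        build(job: 'Stop ECS Task (utility)', parameters: [
                            string(name: 'region', value: params.region),
                            string(name: 'cluster', value: params.cluster),
                            string(name: 'family', value: taskDefinitionFamily)
                        ])
                    } else if (params.config_API == 'Start Task') {
                        build(job: 'Start ECS Task (utility)', parameters: [
                            string(name: 'region', value: params.region),
                            string(name: 'cluster', value: params.cluster),
                            string(name: 'taskDefinition', value: taskDefinition),
                            string(name: 'containerInstanceIds', value: params.containerInstanceIdsToStartOn)
                        ])
                    } else {
                        echo 'Did you forget to check a box?'
                    }
                }
            }
        }
    }
}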

Related

Iterate a parallel loop of dynamic stages based on text file content in a Jenkins pipeline

I'm a newbie to Jenkins DSL Groovy scripting. I have a parameterized Jenkins job which takes two inputs, $param1 and $param2.
I have two stages. The first stage generates an output.txt file in the workspace with contents like the below. The content of output.txt changes based on the shell script execution in stage 1, so the values are dynamic.
output.txt
svn-test
svn_prod
svn_dev
The second stage has to take its input from output.txt and iterate in a parallel loop, dynamically creating stages. I have the code below, but it doesn't take input from the output.txt file. I'm unable to override the array in the stage and iterate in parallel.
def jobs = []
def parallelStagesMap = jobs.collectEntries {
    ["${it}" : generateStage(it)]
}

def generateStage(job) {
    return {
        stage("stage: ${job}") {
            script {
                git credentialsId: 'github', url: 'ssh://github.com/promp/${job}.git', branch: master
                echo "This is ${job}."
                sh ''' make ${parameter1}#"${paramete2}" '''
            }
        }
    }
}

pipeline {
    agent any
    parameters {
        string(name: 'parameter1', defaultValue: 'UIAutomation', description: 'Please enter the value')
        string(name: 'parameter2', defaultValue: 'UISite', description: 'Please enter the value')
    }
    stages {
        stage('non-parallel stage') {
            steps {
                script {
                    echo 'This stage will be executed first.'
                    sh '''./rotate_script.sh output.txt'''
                }
            }
        }
        stage('parallel stage') {
            failFast false
            steps {
                script {
                    def filePath = readFile('output.txt').trim()
                    def lines = filePath.readLines()
                    line.each {
                        // I have tried to read lines and pass it a value. It didn't work out.
                    }
                    parallel parallelStagesMap
                }
            }
        }
    }
}
Ideally, this is how one of the generated second-stage blocks would look; multiple parallel stages should be created based on the output.txt file:
stage('svn-test') {
    steps {
        sh 'mkdir -p svn-test'
        dir("svn-test") {
            script {
                git credentialsId: 'github', url: 'ssh://github.com/promp/svn-test.git', branch: master
                sh ''' make ${parameter1}#"${parameter2}" '''
            }
        }
    }
}
I finally got something like this to work. I had to move my Groovy list and the parallelStagesMap into my pipeline. The stage map was getting built at the start of the script, while the list was still empty, so it never produced any results. If you build it AFTER the list is populated, it works.
def generateStage(job) {
    return {
        stage("stage: ${job}") {
            script {
                // Double quotes so ${job} is interpolated; branch name quoted as a string
                git credentialsId: 'github', url: "ssh://github.com/promp/${job}.git", branch: 'master'
                echo "This is ${job}."
                sh ''' make ${parameter1}#"${parameter2}" '''
            }
        }
    }
}

pipeline {
    agent any
    parameters {
        string(name: 'parameter1', defaultValue: 'UIAutomation', description: 'Please enter the value')
        string(name: 'parameter2', defaultValue: 'UISite', description: 'Please enter the value')
    }
    stages {
        stage('non-parallel stage') {
            steps {
                script {
                    echo 'This stage will be executed first.'
                    sh '''./rotate_script.sh output.txt'''
                }
            }
        }
        stage('parallel stage') {
            failFast false
            steps {
                script {
                    def jobs = []
                    def filePath = readFile('output.txt').trim()
                    def lines = filePath.readLines()
                    lines.each { job ->
                        jobs.add("${job}")
                    }
                    // Build the map only after the list has been populated
                    def parallelStagesMap = jobs.collectEntries {
                        ["${it}" : generateStage(it)]
                    }
                    parallel parallelStagesMap
                }
            }
        }
    }
}
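As a follow-up note, the intermediate jobs list is not strictly needed: readLines() already returns a list, so (if preferred) the map can be built in a single expression inside the same script block:
script {
    def parallelStagesMap = readFile('output.txt').trim().readLines().collectEntries {
        [(it): generateStage(it)]
    }
    parallel parallelStagesMap
}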

Extract version and name from package.json using a Jenkinsfile (Jenkins Pipeline)

My question is simple: I want to extract the version and name from package.json, but when I do, I get extra output along with the values (see the attached file). Why is that?
Jenkinsfile
pipeline {
    agent any
    environment {
        CI = 'true'
        //IMAGE = bat 'node -e "console.log(require(`./package.json`).name);"'
        //VERSION = bat(script: 'npm run get-version')
        //VERSION = bat '(npm run version --silent)'
        //PACKAGE_VERSION = bat '(node -p -e "require(\'./package.json\').version")'
        GIT_COMMIT_SHORT_HASH = GIT_COMMIT.take(7)
        REPOSITORY = 'repo.dimiroma.com'
        PORT = '8085'
        LATEST = 'latest'
    }
    stages {
        stage('Set Build Variables') {
            steps {
                script {
                    VERSION = bat(script: '''node -e "console.log(require('./package.json').version)"''', returnStdout: true).trim()
                    def getProjectName = { ->
                        return bat(
                            returnStdout: true,
                            script: 'node -e "console.log(require(\'./package.json\').name);"'
                        ).trim()
                    }
                    //VERSION = getProjectVersion()
                    IMAGE = getProjectName()
                }
            }
        }
        stage('Information') {
            steps {
                script {
                    bat 'node -v'
                    bat 'git --version'
                    bat 'docker -v'
                    echo "JOB BASE NAME: ${JOB_BASE_NAME} BUILD-NUMBER: ${BUILD_NUMBER}"
                    echo "Version: ${VERSION}"
                    //echo "Version: ${PACKAGE_VERSION}"
                    echo "Name: ${IMAGE}"
                    echo "Branch_name: ${env.BRANCH_NAME}"
                    final scmVars = checkout(scm)
                    echo "scmVars: ${scmVars}"
                    echo "scmVars.GIT_COMMIT: ${scmVars.GIT_COMMIT}"
                    echo "scmVars.GIT_BRANCH: ${scmVars.GIT_BRANCH}"
                }
            }
        }
        stage('Install Dependencies') {
            steps {
                bat 'npm install'
            }
        }
        stage('Test') {
            steps {
                bat 'npm test -- --coverage a'
            }
        }
        stage('Create Docker Image') {
            steps {
                bat "docker images"
                bat "docker build . -t ${IMAGE}:${VERSION}-${GIT_COMMIT_SHORT_HASH}"
            }
        }
    }
}
Dockerfile
Please help.
You could do it like this. I assume you have npm installed as a tool for this:
stage("Build and push docker non-release"){
steps {
script {
def version = sh(returnStdout: true, script: "npm version")
echo "Version is ${version}"
def versionProps = readJSON text: version
echo "Project version is ${versionProps['project-name']}"
}
}
}
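For what it's worth, the "extra output" in the question is most likely cmd.exe echoing the command line itself: on Windows, bat(returnStdout: true) captures that echo together with the real output. Prefixing the script with @ suppresses the echo. A minimal sketch against the same package.json:
script {
    // The leading '@' stops cmd.exe from echoing the command into the captured output
    def version = bat(returnStdout: true,
        script: '@node -e "console.log(require(\'./package.json\').version)"').trim()
    def name = bat(returnStdout: true,
        script: '@node -e "console.log(require(\'./package.json\').name)"').trim()
    echo "Name: ${name}, Version: ${version}"
}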

Dynamically generate tasks in a scripted Jenkins pipeline for parallel execution, always with the same index

I work on a scripted Jenkins pipeline to shut down environments within our cloud providers. Since the environments are independent from each other, I want to execute them in parallel.
Here is the code:
def environmentsArray = ["ZULI-TestCenter", "ZULI-FinanceCenter"]
node("someNode") {
    def parEx = [:]
    for (def item : environmentsArray) {
        def environment = item
        parEx[environment] = {
            // Identify environments
            environmentLowerCase = environment.split('-')[0].toLowerCase()
            appNameLowerCase = environment.split('-')[1].toLowerCase()
            stage("...") {
            }
            stage("...") {
            }
        } // End of parEx
    } // End of for
    parallel parEx
} // End of node
Unfortunately, the values environmentLowerCase and appNameLowerCase are not updated per iteration, i.e. they always have the same value:
[Pipeline] { (STOPPING DMS tasks: ZULI-TestCenter) // correct
[Pipeline] stage
[Pipeline] { (STOPPING DMS tasks: ZULI-FinanceCenter) // correct
[Pipeline] echo
zuli, financecenter // wrong, should be testcenter
[Pipeline] withCredentials
[Pipeline] echo
zuli, financecenter // correct
What am I doing wrong?
As mentioned by @daggett, I simply had to add def:
def environmentsArray = ["ZULI-TestCenter", "ZULI-FinanceCenter"]
node("someNode") {
    def parEx = [:]
    for (def item : environmentsArray) {
        def environment = item
        parEx[environment] = {
            // Identify environments
            def environmentLowerCase = environment.split('-')[0].toLowerCase() // added def
            def appNameLowerCase = environment.split('-')[1].toLowerCase() // added def
            stage("...") {
            }
            stage("...") {
            }
        } // End of parEx
    } // End of for
    parallel parEx
} // End of node
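The reason def fixes this is standard Groovy scoping: assigning to an undeclared name inside a script puts it into the script's shared binding, so all the parallel closures were reading and writing the same two variables, and the last loop iteration's write won. With def, each closure captures its own local copies. A stripped-down illustration outside Jenkins:
def closures = [:]
for (def item : ['ZULI-TestCenter', 'ZULI-FinanceCenter']) {
    def environment = item
    closures[environment] = {
        shared = environment.toLowerCase()      // no 'def': one binding variable shared by every closure
        def local = environment.toLowerCase()   // 'def': private to this closure
        // Run in parallel, 'shared' may already hold the other closure's value
        // by the time it is read back, while 'local' is always this closure's own.
    }
}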

Ansible 2.4: how to skip installing an apt package that is already installed, using a with_items loop and a when condition

I would like to know if there is a way, in a with_items loop, to check and skip installation of apt dependencies that are already installed, using a when condition.
If I add the when condition at the end of the with_items list, the condition checks the whole list instead of just the relevant item - in this example, python2.7.
- name: check if python already installed
  shell: dpkg-query -W python2.7
  register: check_python2
  ignore_errors: True

- name: Install apt dependencies
  apt:
    name: "{{ item.name }}{{ item.version }}"
    state: present
    allow_unauthenticated: yes
    force: yes
  with_items:
    - { name: 'python2.7', version: '' }
    - { name: 'ruby', version: '' }
    - { name: 'postgresql-9.5', version: '' }
    - { name: 'postgresql-contrib-9.5', version: '' }
    - { name: 'libpq-dev', version: '' }
    - { name: 'nodejs', version: '=9.*' }
    - { name: 'python-setuptools', version: '' }
    - { name: 'python-pip', version: '' }
    - { name: 'python-pkg-resources', version: '' }
    - { name: 'sshpass', version: '' }
    - { name: 'zip', version: '' }
    - { name: 'mongodb-org', version: '=4.0.0' }
    - { name: 'libfontconfig', version: '' }
    - { name: 'ntp', version: '' }
    - { name: 'fio', version: '' }
  when: check_python2.rc != 0
  when: check_ruby.rc != 0
How can I add the when condition so that it checks only the relevant dependency? I would like to check all the dependencies, and if one of them is not installed, install it; otherwise, skip it.
I'm not sure I understand the question - do you mean both conditions, check_python2.rc != 0 and check_ruby.rc != 0?
when: check_python2.rc != 0 and check_ruby.rc != 0
- hosts: all:!
  gather_facts: False
  vars:
    packages:
      - python2.7
      - ruby
      - postgresql-9.5
      - postgresql-contrib-9.5
      - libpq-dev
      - nodejs
      - python-setuptools
      - python-pip
      - python-pkg-resources
      - sshpass
      - zip
      - mongodb-org=4.0.0
      - libfontconfig
      - ntp
      - fio
  tasks:
    - name: "Install dependencies"
      become: yes
      apt:
        pkg: "{{ packages }}"
        state: present
        # allow_unauthenticated and force are apt module options, so they belong under apt:
        allow_unauthenticated: yes
        force: yes
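Two hedged notes on this. First, apt with state: present is already idempotent: packages that are installed are skipped without any pre-check, which answers the original question by itself. Second, if a per-item condition really is needed, the when clause can test the loop item, e.g. with package_facts (available from Ansible 2.5, slightly newer than the 2.4 in the title); the two-item list below is just a trimmed illustration:
- name: Gather installed package facts
  package_facts:
    manager: apt

- name: Install only the missing apt dependencies
  apt:
    name: "{{ item.name }}{{ item.version }}"
    state: present
  with_items:
    - { name: 'python2.7', version: '' }
    - { name: 'ruby', version: '' }
  when: item.name not in ansible_facts.packages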

Log to file with Gradle 4

Until Gradle 3, I used this to write Gradle's output to a log file:
def fileLogger = [
    onOutput: {
        File logfile = new File('gradle.log')
        logfile << it
    }
] as org.gradle.api.logging.StandardOutputListener

gradle.useLogger(fileLogger)
This does not work with Gradle 4.
Update for Gradle 5:
It works when using logging.addStandardOutputListener instead of gradle.useLogger, and adding it to all tasks:
// logger
def fileLogger = [
    onOutput: {
        File logfile = new File('gradle.log')
        logfile << it
    }
] as org.gradle.api.logging.StandardOutputListener

// for configuration phase
logging.addStandardOutputListener(fileLogger)

// for execution phase
gradle.taskGraph.whenReady { taskGraph ->
    taskGraph.allTasks.each { Task t ->
        t.doFirst {
            logging.addStandardOutputListener(fileLogger)
        }
    }
}
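As a usage note, if capturing the output of a single task is enough, the listener can be attached to just that task instead of the whole task graph. A sketch reusing the fileLogger above (tasks.named assumes Gradle 4.9+):
tasks.named('build').configure { Task t ->
    t.doFirst {
        // Task.getLogging() returns a LoggingManager that accepts the same listener
        t.logging.addStandardOutputListener(fileLogger)
    }
}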
