Get credentials in Pipeline script

I'm using "Parameterized Build" and in particular the "Credentials Parameter" type.
In my Pipeline script, how can I access the username and password of the TEST variable?
timeout(time: 5, unit: 'MINUTES') {
    node('jenkins-slave1') {
        ....
        sh 'echo ${TEST_USER}'
        sh 'echo ${TEST_PASSWORD}'
    }
}
I know that inside the script I can use
withCredentials([[$class: 'UsernamePasswordMultiBinding',
                  credentialsId: '5994c63b-ee86-4ef1-b28a-ee2236662226',
                  passwordVariable: 'TEST_PASSWORD',
                  usernameVariable: 'TEST_USER']]) {
    sh "echo ${env.TEST_USER}"
    sh "echo ${env.TEST_PASSWORD}"
}
but I need to let the final user running the job choose which credentials to use (a sketch of combining the two follows below).
Thanks!
Riccardo
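For what it's worth, here is a minimal sketch of how the two pieces might be combined, assuming the Credentials parameter is named TEST (so its value, a credentials ID, is available as params.TEST) and the Credentials Binding plugin's usernamePassword binding is installed:
// Hypothetical sketch: feed the user-selected credentials ID (params.TEST)
// into withCredentials instead of hard-coding it.
timeout(time: 5, unit: 'MINUTES') {
    node('jenkins-slave1') {
        withCredentials([usernamePassword(credentialsId: params.TEST,
                                          usernameVariable: 'TEST_USER',
                                          passwordVariable: 'TEST_PASSWORD')]) {
            // Single quotes so the shell, not Groovy, expands the secrets.
            sh 'echo "$TEST_USER"'
            sh 'echo "$TEST_PASSWORD"'
        }
    }
}
This way the user picks the stored credentials when starting the build, and the pipeline never hard-codes a credentials ID.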

Related

How can I loop through a Jenkins Extended Choice Parameter

I have the below Extended Choice Parameter in my Job:
In case I choose both values, how can I loop through them in my Jenkinsfile?
sh "docker run --rm --net=host -v ${WORKSPACE}:/app/ ${MyImage} --env ${ENV}"
This is what I used (with the help of this post: Single parameter with multiple values - referencing extended-choice parameter values):
str = env.ENV.split(',')
for (String values : str) {
    sh "docker run --rm --net=host -v ${WORKSPACE}:/app/ ${MyImage} --env ${values}"
}

How to use Jenkins file parameter in Pipelines

My Jenkins pipeline runs on the Slave using agent { node { label 'slave_node1' } }.
I use a Jenkins file parameter named uploaded_file and upload a file called hello.pdf.
My pipeline contains the following code:
stage('Precheck') {
    steps {
        sh "echo ${WORKSPACE}"
        sh "echo ${uploaded_file}"
        sh "ls -ltr ${WORKSPACE}/*"
    }
}
Output:
/web/jenkins/workspace/MYCOPY
hello.pdf
ls: cannot access /web/jenkins/workspace/MYCOPY/*: No such file or directory
As you can see, no files were found in the slave's WORKSPACE.
Can you help me understand whether I'm checking for the uploaded file in the correct location, i.e. under the WORKSPACE directory?
How can I get the file uploaded to the slave's WORKSPACE?
I'm on Jenkins version 2.249.1.
Can I get this to work, at least on the latest version of Jenkins?
So do you have a fixed file that is copied in every build, i.e. is it the same file every time?
In that case you can save it as a secret file in Jenkins and do the following:
environment {
    FILE = credentials('my_file')
}
stages {
    stage('Preparation') {
        steps {
            // Copy your file to the workspace
            sh "cp ${FILE} ${WORKSPACE}"
            // Verify the file was copied
            sh "ls -la"
        }
    }
}
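For completeness, a sketch of that answer as a full declarative pipeline, assuming the secret file was stored in Jenkins with the credentials ID my_file (the credentials() helper exposes the path to a temporary copy of the file on the agent):
// Hypothetical sketch: copy a Jenkins secret file into the agent's workspace.
pipeline {
    agent { node { label 'slave_node1' } }
    environment {
        FILE = credentials('my_file')   // path to a temporary copy of the secret file
    }
    stages {
        stage('Preparation') {
            steps {
                sh 'cp "$FILE" "$WORKSPACE"'   // copy the secret file into the workspace
                sh 'ls -la'                    // verify it arrived
            }
        }
    }
}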

Run script after SQL Server service is up in Dockerfile

I have a database for my app and I want to create it at run time in Docker.
I have a file CreateDB.sh that creates all the tables and stored procedures that I want.
I tried this:
FROM mcr.microsoft.com/mssql/server
ENV ACCEPT_EULA=Y \
    SA_PASSWORD=qwe123QWE
USER root
RUN mkdir /home/db
COPY ./db /home/db
RUN chmod +x /home/db/DbScriptLinux.sh
WORKDIR /home/db/
CMD ["/bin/bash", "/home/db/DbScriptLinux.sh"]
but it returns an error:
LoginTimeout
Is there any way to run my script only after all services (SQL Server) have started?
You can use an if statement, for example: RUN if [[ -z "$arg" ]] ; then echo Argument not provided ; else echo Argument is $arg ; fi
Another way would be to use command1 && command2, so that if command1 is successful, command2 runs afterwards.
Your last line, CMD ["/bin/bash", "/home/db/DbScriptLinux.sh"], is fine if it is meant to run the script every time you start the container (as the default command); otherwise it would be better to use a RUN instruction.

Save commit message string value in an environment variable via Jenkins with bat (Windows) and using Pipeline?

I would like to save the value of some string (in this case the commit message from Git) as an environment variable for a multibranch pipeline in Jenkins. This is part of my pipeline:
pipeline {
    agent any
    environment {
        GIT_MESSAGE = """${bat(
            script: 'git log --no-walk --format=format:%s ${%GIT_COMMIT%}',
            returnStdout: true
        )}""".trim()
    }
    stages {
        stage('Environment A') {
            steps {
                bat 'echo %GIT_MESSAGE%'
                bat '%GIT_MESSAGE%'
            }
        }
        ...
}
But after this, echo %GIT_MESSAGE% is returning:
echo D:\.jenkins\workspace\folder log --no-walk --format=format:GIT_COMMIT} 1>git
And naturally, if I run it with bat '%GIT_MESSAGE%' it fails. I know part of the answer may lie in the way the environment variable is passed to the bat script (${%GIT_COMMIT%}), but I can't seem to figure it out.
Any ideas?
I just solved this issue. It had to do with the way Groovy performs string interpolation. I left it working with single-line strings (i.e. "..."), but I am pretty sure it should also work with multiline strings ("""...""").
This is the working solution for now:
pipeline {
    agent any
    environment {
        GIT_MESSAGE = "${bat(script: "git log --no-walk --format=format:%%s ${GIT_COMMIT}", returnStdout: true)}".readLines().drop(2).join(" ")
    }
    stages {
        stage('Environment A') {
            steps {
                bat 'echo %GIT_MESSAGE%'
                bat '%GIT_MESSAGE%'
            }
        }
        ...
}
Notice that readLines(), drop(2) and join(" ") were necessary in order to get only the commit message, without the path from which the command was run.
It was also important to use "..." (double quotes) for the script parameter of the bat step; otherwise interpolation does not happen and the environment variable GIT_COMMIT would not have been recognized.
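As a rough illustration of why that post-processing is needed (the sample string below is made up, but it mimics how bat's returnStdout output starts with the echoed prompt/command line and a blank line before the actual git output):
// Hypothetical plain-Groovy illustration of the readLines().drop(2).join(" ") cleanup.
def raw = 'D:\\.jenkins\\workspace\\job>git log --no-walk --format=format:%s abc1234\r\n\r\nFix login redirect bug'
def message = raw.readLines().drop(2).join(" ")
assert message == 'Fix login redirect bug'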

Nomad task getting killed

I have two tasks in a task group:
1) a db task to bring up a database, and
2) the app that needs the db to be up.
Both start in parallel; the db task takes a little while to come up, but by then the app has already decided the db is not up and kills the db task. Any solutions? Please advise.
It's somewhat common to have an entrypoint script that checks whether the db is healthy before starting the app. Here's a script I've used before:
#!/bin/sh
set -e

# The command to run once Postgres is reachable is passed as "$@",
# e.g. entrypoint.sh python main.py
postgres_ready() {
    if test -z "${NO_DB}"
    then
        PGPASSWORD="${RDS_PASSWORD}" psql -h "${RDS_HOSTNAME}" -U "${RDS_USERNAME}" -d "${RDS_DB_NAME}" -c '\l'
        return $?
    else
        echo "NO_DB set - Postgres will pretend to be up"
        return 0
    fi
}

until postgres_ready
do
    >&2 echo "Postgres is unavailable - sleeping"
    sleep 1
done

>&2 echo "Postgres is up - continuing..."
exec "$@"
You could save it as entrypoint.sh and run it with your application start command as the argument, e.g. entrypoint.sh python main.py.
