How to use a Jenkins file parameter in Pipelines

My Jenkins pipeline runs on a slave node using agent { node { label 'slave_node1' } }.
I use a Jenkins file parameter named uploaded_file and upload a file called hello.pdf.
My pipeline contains the following code:
stage('Precheck') {
    steps {
        sh "echo ${WORKSPACE}"
        sh "echo ${uploaded_file}"
        sh "ls -ltr ${WORKSPACE}/*"
    }
}
Output:
/web/jenkins/workspace/MYCOPY
hello.pdf
ls: cannot access /web/jenkins/workspace/MYCOPY/*: No such file or directory
As you can see, no files were found in the slave's WORKSPACE.
Can you help me understand whether I'm checking for the uploaded file in the correct location, i.e. under the WORKSPACE directory?
How can I get the file uploaded to the slave's WORKSPACE?
I'm on Jenkins version 2.249.1.
Can I get this to work, at least on the latest version of Jenkins?

So do you have a fixed file that is copied in every build, i.e. it's the same file every time?
In that case you can save it as a secret file in Jenkins and do the following:
environment {
    FILE = credentials('my_file')
}
stages {
    stage('Preparation') {
        steps {
            // Copy your file to the workspace
            sh "cp ${FILE} ${WORKSPACE}"
            // Verify the file is copied.
            sh "ls -la"
        }
    }
}
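If you prefer scripted syntax, here is a minimal sketch of the same idea using withCredentials with a file binding, assuming the same 'my_file' secret-file credential and the slave_node1 label from the question (untested, so treat it as a starting point):
node('slave_node1') {
    stage('Preparation') {
        // Bind the secret file to a temporary path exposed as $FILE,
        // then copy it into the agent's workspace.
        withCredentials([file(credentialsId: 'my_file', variable: 'FILE')]) {
            sh 'cp "$FILE" "$WORKSPACE"'
            // Verify the copy landed.
            sh 'ls -la'
        }
    }
}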

Is there any way to list files from Hadoop HDFS and store only the file names locally, not the actual files?
For example: I have a file india_20210517_20210523.csv. I'm currently copying files from HDFS to local using the copyToLocal command, but copying the files is time-consuming as they are huge. All I need is the names of the files, stored in a .txt file, so I can perform cut operations on them with a bash script.
Kindly help me.
The easiest way to do this is to use the command below.
hdfs dfs -ls /path/fileNames | awk '{print $8}' | xargs -n 1 basename > Output.txt
How it works:
hdfs dfs -ls : This will list all the information about the path
awk '{print $8}' : To print the 8th column of the output
xargs -n 1 basename : To get the file names alone excluding the path
> Output.txt : To store the file names to a text file
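Recent Hadoop releases also support hdfs dfs -ls -C, which prints only the paths and makes the awk step unnecessary. A shorter variant, assuming your Hadoop version has the -C flag:
hdfs dfs -ls -C /path/fileNames | xargs -n 1 basename > Output.txt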
Hope this answers your question.
If you want to do this programmatically, you can use FileSystem and FileStatus objects from Hadoop to:
list the contents of your (current or another) target directory,
check if each of the records of this directory is either a file or another directory, and
write the name of each file as a new line to a file stored locally.
The code for this type of application can look like this:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.*;
import java.io.File;
import java.io.PrintWriter;

public class Dir_ls
{
    public static void main(String[] args) throws Exception
    {
        // get input directory as a command-line argument
        Path inputDir = new Path(args[0]);
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        if(fs.exists(inputDir))
        {
            // list directory's contents
            FileStatus[] fileList = fs.listStatus(inputDir);
            // create file and its writer
            PrintWriter pw = new PrintWriter(new File("output.txt"));
            // scan each record of the contents of the input directory
            for(FileStatus file : fileList)
            {
                if(!file.isDirectory()) // only take files into account
                {
                    System.out.println(file.getPath().getName());
                    pw.write(file.getPath().getName() + "\n");
                }
            }
            pw.close();
        }
        else
            System.out.println("Directory named \"" + args[0] + "\" doesn't exist.");
    }
}
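To try it out, you could compile and run the class along these lines (a sketch; the jar name is illustrative, and hadoop classpath supplies the Hadoop dependencies):
# compile against the Hadoop libraries and package the class
javac -cp "$(hadoop classpath)" Dir_ls.java
jar cf dir_ls.jar Dir_ls.class
# run it against the root (.) directory of HDFS
hadoop jar dir_ls.jar Dir_ls .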
So if we list the files from the root (.) directory of HDFS, which holds both directories and text files, the application prints the names of the files (and only the files) to the command line, and the same names, one per line, are written to the output.txt file stored locally.

Save commit message string value in an environment variable via Jenkins with bat (Windows) and using Pipeline?

I would like to save the value of some string (in this case the commit message from Git) as an environment variable for a multibranch pipeline in Jenkins. This is part of my pipeline:
pipeline {
    agent any
    environment {
        GIT_MESSAGE = """${bat(
            script: 'git log --no-walk --format=format:%s ${%GIT_COMMIT%}',
            returnStdout: true
        )}""".trim()
    }
    stages {
        stage('Environment A') {
            steps {
                bat 'echo %GIT_MESSAGE%'
                bat '%GIT_MESSAGE%'
            }
        }
        ...
    }
}
But after this, echo %GIT_MESSAGE% is returning:
echo D:\.jenkins\workspace\folder log --no-walk --format=format:GIT_COMMIT} 1>git
And naturally, if I run it with bat '%GIT_MESSAGE%', it fails. I know part of the answer may lie in the way the environment variable is passed to the bat script (${%GIT_COMMIT%}), but I can't seem to figure it out.
Any ideas?
I just solved this issue. It had to do with the way Groovy performs string interpolation. I left it working with single-line strings (i.e. "..."), but I am pretty sure it should also work with multiline strings ("""...""").
This is the working solution for now:
pipeline {
    agent any
    environment {
        GIT_MESSAGE = "${bat(script: "git log --no-walk --format=format:%%s ${GIT_COMMIT}", returnStdout: true)}".readLines().drop(2).join(" ")
    }
    stages {
        stage('Environment A') {
            steps {
                bat 'echo %GIT_MESSAGE%'
                bat '%GIT_MESSAGE%'
            }
        }
        ...
    }
}
Notice that readLines(), drop(2) and join(" ") were necessary in order to get only the commit message, without the path from which the command was run.
It was also important to use "..." inside the script parameter of the bat function; otherwise interpolation does not happen and the environment variable GIT_COMMIT would not have been recognized.
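As a side note, you may be able to avoid the readLines().drop(2) cleanup entirely by prefixing the command with @, which stops cmd.exe from echoing the command line into the captured output. A sketch of that variant (untested; behavior can vary across Jenkins versions):
environment {
    // '@' suppresses cmd.exe's command echo, so stdout contains only
    // the commit subject; trim() strips the trailing newline.
    GIT_MESSAGE = bat(script: "@git log --no-walk --format=format:%%s ${GIT_COMMIT}", returnStdout: true).trim()
}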

Fetching uploaded files in Jenkins [duplicate]

This question already has an answer here:
Upload file in Jenkins input step to workspace
(1 answer)
Closed 3 years ago.
The upload goes through in Jenkins (per the progress-bar status at the bottom), but I don't see an option to locate or fetch the uploaded file from Jenkins.
Here is a simple script with file parameters:
properties([
    parameters([
        file(name: "file1.zip", description: 'Choose path to upload file1.zip from local system.'),
        file(name: "file2.zip", description: 'Choose path to upload file2.zip from local system.')
    ])
])
node {
    stage("Fetch Uploaded File") {
        sh '''
            ls -l file1.zip file2.zip
            ls -l ${WORKSPACE}/file1.zip ${WORKSPACE}/file2.zip
        '''
    }
}
I tried the input-step file option per another post, but had no luck reaching the uploaded file. Any inputs?
def inputFile = input message: 'Upload file', parameters: [file(name: 'data.ear')]
new hudson.FilePath(new File("$workspace/data.ear")).copyFrom(inputFile)
inputFile.delete()
With the full scripted pipeline pasted above, I'm getting the error below:
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Fetch Uploaded File)
[Pipeline] sh
[testSh] Running shell script
+ ls -l file1.zip file2.zip
ls: cannot access file1.zip: No such file or directory
ls: cannot access file2.zip: No such file or directory
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 2
Finished: FAILURE
This is a known bug: JENKINS-27413.
There is, however, a library with a workaround that can help you: https://github.com/janvrany/jenkinsci-unstashParam-library. As described in its README, you can add this library to your Jenkins (see Extending with Shared Libraries) and use it the following way:
library "jenkinsci-unstashParam-library"
node {
def file_in_workspace = unstashParam "file"
sh "cat ${file_in_workspace}"
}
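If you are on declarative pipeline, the same call should work from a script block; here is a sketch wired to the file1.zip parameter name from the question (untested):
library "jenkinsci-unstashParam-library"

pipeline {
    agent any
    stages {
        stage("Fetch Uploaded File") {
            steps {
                script {
                    // Saves the uploaded file parameter into the workspace
                    // and returns its path.
                    def f = unstashParam "file1.zip"
                    sh "ls -l ${f}"
                }
            }
        }
    }
}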

no permanent file created

I'm running a conf file in Logstash which gives the required output, but when I run ls to list the files, the file hasn't been created. Need help resolving this issue.
conf file:
input {
    elasticsearch {
        hosts => "localhost:9200"
        index => "index1"
    }
}
output {
    file {
        path => "/opt/elk/logstash-6.5.0/example.json"
    }
    stdout {}
}
Result of execution:
[2019-10-13T14:07:47,503][INFO ][logstash.outputs.file ] Opening file {:path=>"/opt/elk/logstash-6.5.0/example.json"}
{
"#timestamp" => 2019-10-13T08:37:46.715Z,
"example" => [
//data within example//
}
[2019-10-13T16:44:19,982][INFO ][logstash.pipeline] Pipeline has terminated {:pipeline_id=>"main", :thread=>"#<Thread:0x4f8848b run>"}
but when I run ls to see example.json, it doesn't show up
ls command:
elastic#es-VirtualBox:~/opt/elk/logstash-6.5.0$ ls
bin CONTRIBUTORS fetch.conf Gemfile.lock lib logs logstash-core-plugin-api NOTICE.TXT throw.conf vendor config data Gemfile grokexample.conf LICENSE.txt logstash-core modules output.json tools x-pack
So I was wondering, does the conf only create a temporary file?
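One thing worth checking: the path in the conf is absolute (/opt/elk/logstash-6.5.0/example.json), while the ls above was run from the home-relative directory ~/opt/elk/logstash-6.5.0, which is a different location. Listing the absolute path directly would confirm whether the file was actually written:
ls -l /opt/elk/logstash-6.5.0/example.json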

IPython %run testscript_in_pythonpath.py returns a "no file found" error

I would like to use the %run magic command to run a script in a directory which is on the PYTHONPATH variable. The script reads some files in the working directory. However, when I try to run the script using the command %run testscript_in_pythonpath.py, it returns an error. I thought files on PYTHONPATH would be accessible to the interpreter, no?
(Reposting as an answer)
$PYTHONPATH is what Python uses to look up modules to import, not scripts to run.
To run a file from $PYTHONPATH, you can do import testscript_in_pythonpath. Or, in IPython:
%run -m testscript_in_pythonpath
The difference is that if the file has an if __name__ == '__main__': section, %run will trigger that.
From a system shell, you can do the same thing as:
python -m testscript_in_pythonpath
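For illustration, a minimal module (reusing the testscript_in_pythonpath name from the question) showing the guard that %run -m and python -m trigger but a plain import does not:
# testscript_in_pythonpath.py -- placed somewhere on $PYTHONPATH
def main():
    # Read files relative to the current working directory,
    # as the question describes.
    print("running as a script")

if __name__ == '__main__':
    # Runs under `%run -m` / `python -m`, but not on plain import.
    main()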
