download multiple files from sftp with Jenkins - batch-file

I have to download all files from an FTP folder using explicit FTP over SSL/TLS. I need this for a Jenkins job running on a Windows machine and didn't find any plugins, so I am trying to use a batch script with curl. The following code lists the contents of the folder:
set "$FILEPATH=C:\temp"
set "$REMOTEPATH=/files/"
curl -u user:pass --ftp-ssl ftp://hostname.com:port%$REMOTEPATH% -o %$FILEPATH%
I figured out that with curl I have to download the files one by one, but how can I loop through all the files in an FTP directory and get them one by one?
Is there a better way to achieve this? I read about mget, but it doesn't seem to work with explicit FTP over SSL.
Thanks

I couldn't get it to work with batch directly in the script, so I wrote a Python script instead; the pipeline downloads it from git and executes it as a step. Python has some nice libraries for this, so it works like a charm.
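For anyone who wants to stay with plain batch and curl, a rough, untested sketch of the per-file loop could look like the following (hostname, port, credentials and paths are placeholders; -l asks curl for a name-only FTP listing):

@echo off
set "REMOTEURL=ftp://hostname.com:21/files/"
set "LOCALPATH=C:\temp"
rem Get a name-only listing of the remote directory over explicit FTP/TLS
curl -u user:pass --ftp-ssl -l "%REMOTEURL%" -o "%LOCALPATH%\filelist.txt"
rem Download each listed file, one curl call per file
for /f "usebackq delims=" %%F in ("%LOCALPATH%\filelist.txt") do (
    curl -u user:pass --ftp-ssl "%REMOTEURL%%%F" -o "%LOCALPATH%\%%F"
)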

Related

What "option" to use with "WGET" for selecting only few files with particular extension from a FTP directory

I am trying to download files with a particular datestamp as an extension from a folder on an FTP server. Since the folder contains other files as well, I want to download only the files with that particular datestamp.
I tried using wget files_datestamp*.extension, which didn't work.
I also tried using wget -i files_datestamp*.extension, which downloads all.
My question is: What option to use with wget to download only particular files that I am interested in?
wget http://collaboration.cmc.ec.gc.ca/cmc/CMOI/NetCDF/NMME/1p0deg/#%23%23/CanCM3_201904_r4i1p1_20190501*.nc4
The link you've shared is over HTTP and not FTP. As a result, it is not possible to glob over the filenames; that is feasible only over FTP.
With HTTP, you need access to a directory listing page which tells you which files are available. Then use -r --accept-regex=<regex here> to download your files, for example:
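Something along these lines might work, assuming the server exposes a directory listing page (the regex only illustrates the datestamp pattern, and <subdir> stands for the directory that is masked in the question; adapt both to your case):

wget -r -np -nd -l 1 --accept-regex="CanCM3_201904_r4i1p1_20190501.*\.nc4" http://collaboration.cmc.ec.gc.ca/cmc/CMOI/NetCDF/NMME/1p0deg/<subdir>/

Here -nd keeps wget from recreating the remote directory tree locally.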

how to install check_inode plugin in nagios

I have to install a plugin on a Red Hat server where Nagios is already configured.
The plugin to be installed is inode_checker, which I got from this link:
how to install inode checker in nagios
When I opened this link I found a shell script there.
Now I am not sure whether I have to place the shell script directly on the server in /usr/local/nagios/libexec/ or whether there is another way to do it, since the other plugins available in this location seem to be different and I am not able to open them.
What am I doing wrong here? Please advise.
Yes, this is a bash script, so simply download it and place it in the folder where your other scripts sit. Make sure to make it executable:
chmod +x scriptname
Then you should be able to use it in Nagios by creating a Command object. You can find the location of the folder where your scripts live by looking at the resource.cfg file, which should hold something like this:
$USER1$=/usr/lib64/nagios/plugins
Hope this helps.
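For reference, the command definition in commands.cfg could look roughly like this (the script name and the -w/-c thresholds are assumptions; check the plugin's own usage output for its real options):

# Hypothetical command definition for the inode checker plugin
define command {
    command_name    check_inode
    command_line    $USER1$/check_inode.sh -w $ARG1$ -c $ARG2$
}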

how to create folder in sonatype nexus repository through command line

Is there a way to create folders and copy artifacts into those folders in a Sonatype Nexus repository through the Windows command line or batch files?
I finally got a solution to my problem using curl:
curl -u admin:admin123 -T C:\upload\Test.txt http://Nexus_Repo_URL/folder_to_be_created/Test.txt
"folder_to_be_created" is the folder which is created in the repository and the file, 'Test.txt', is copied to it
Just upload the artifacts to whatever path is needed using one of the methods described here:
https://support.sonatype.com/hc/en-us/articles/213465818-How-can-I-programatically-upload-an-artifact-into-Nexus-
Any folders needed will be created automatically.

Downloading artifacts from Jenkins using wget or curl

I am trying to download an artifact from a Jenkins project using a DOS batch script. The reason that this is more than trivial is that my artifact is a ZIP file which includes the Jenkins build number in its name, hence I don't know the exact file name.
My current plan of attack is to use wget pointing at: /lastSuccessfulBuild/artifact/
to do some sort of recursive/mirror download.
If I do the following:
wget -r -np -l 1 -A zip --auth-no-challenge --http-user=**** --http-password=**** http://*.*.*.*:8080/job/MyProject/lastSuccessfulBuild/artifact/
(*s are chars I've changed for posting to SO)
I never get a ZIP file. If I omit the -A zip option, I do get the index.html, so I think the authorisation is working, unless it's some sort of session caching issue?
With -A zip I get as part of the response:
Removing ...+8080/job/MyProject/lastSuccessfulBuild/artifact/index.html since it should be rejected.
So I'm not sure if maybe it's removing that file and so not following its links? But doing -A zip,html doesn't work either.
I've tried several wget options, and also curl, but I am getting nowhere.
I don't know if I have the wrong wget options or whether there is something special about Jenkins authentication.
You can append /*zip*/desired_archive_name.zip to the URL of any folder in the artifact location.
If your ZIP file is the only artifact that the job archives, you can use:
http://*.*.*.*:8080/job/MyProject/lastSuccessfulBuild/artifact/*zip*/myfile.zip
where myfile.zip is just a name you assign to the downloadable archive; it could be anything.
If you have multiple artifacts archived, you can either still get the ZIP of all of them and deal with the individual files on extraction, or place the artifact you want into a separate folder and apply /*zip*/ to that folder.
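Combined with the credentials from the question, the download could then look something like this (host, job and file names are still placeholders):

wget --auth-no-challenge --http-user=**** --http-password=**** -O myfile.zip "http://*.*.*.*:8080/job/MyProject/lastSuccessfulBuild/artifact/*zip*/myfile.zip"

or, with curl:

curl -u ****:**** -o myfile.zip "http://*.*.*.*:8080/job/MyProject/lastSuccessfulBuild/artifact/*zip*/myfile.zip"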

Write a batch file to download a .exe app from the internet

I have to write a batch file to download a .exe application and I am finding it very difficult to make sense of the whole process.
All I have got done so far is:
start /d C:"\Program Files <x86>\Google\Chrome\Application"
chrome.exe http://website/directory
This brings up the page I want to go to, and the .exe file is on this page, but I don't know how to download it. I tried:
start /d C:"\Program Files <x86>\Google\Chrome\Application"
chrome.exe http://website/directory/download.exe
This was no good; it tried to load the page, while I thought it would just download the file.
If anyone can give me some insight into this, it would be great.
Do not use Chrome. Depending on the tools you can rely on, use for example wget or curl. For documentation, have a look at the projects' homepages (wget, curl); basic invocation is easy:
wget -O outfile http://example.com/url/to/file
curl -o outfile http://example.com/url/to/file
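Inside a batch file that could look something like this (assuming curl is available on the machine; the URL is the one from the question):

rem Download the installer and then run it
curl -L -o download.exe http://website/directory/download.exe
download.exe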
You may need to change:
http://www.
to
ftp://ftp.
It would help if you provided the actual internet file link.
