We are using a Jenkins server for our daily build process, which executes some bash scripts on remote hosts over SSH. These scripts generate HTML log files on the remote hosts.
We are using the Copy To Slave plugin to copy files to slave machines and the Publish Over SSH plugin to manage SSH sessions in the build process.
Now the question: we want to copy some files (the scripts' log files) from the remote SSH host back to the Jenkins server.
What would be a workable, and ideally better, option for this (a plugin would be preferred, if one exists)?
EDIT:
sshpass is an option, but I'm looking for a plugin or a better way to do the job.
Use the sshpass command to send the file in:
Build Environment -> Execute shell script on remote host using ssh -> Post build script
Sample command:
sshpass -p "password" scp path/of/file <new_server_ip>:/path/of/file
This skips the password prompt for the scp command and supplies the password to scp directly.
I think you can generate an SSH keypair and pass it to the slave as a parameter with, for example, the Config File Provider Plugin.
Then just use scp with that keypair for authentication to retrieve the files, as sketched below.
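A minimal sketch of that approach (the key file name, remote path, and host are placeholders; it assumes the public key has already been installed in the remote user's authorized_keys):
ssh-keygen -t ed25519 -f jenkins_key -N ""   # generate the keypair once
scp -i jenkins_key -o StrictHostKeyChecking=no 'user@remote-host:/var/log/myscripts/*.html' "$WORKSPACE/"   # pull the logs into the Jenkins workspace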
Obviously way too late, but in case you're already using Publish Over SSH, want to avoid duplicating the credentials, and have a shared library, you can use this piece of Groovy to obtain the host configuration:
import jenkins.model.Jenkins
import jenkins.plugins.publish_over_ssh.*

@NonCPS
def getSSHHost(name) {
    def found = null
    // walk the host configurations registered with the Publish Over SSH plugin
    Jenkins.instance.getDescriptorByType(BapSshPublisherPlugin.Descriptor.class).each {
        it.hostConfigurations.each { host ->
            if (host.name == name) {
                found = host
            }
        }
    }
    found
}
As mentioned, this either requires a Global Shared Library (so that your code is trusted) or (probably) a number of admin approvals, sorry for that.
This returns a BapSshHostConfiguration.
For a password connection you can do:
def sshHost = getSSHHost('Configuration Name')
def host = [host: sshHost.hostname, user: sshHost.username, password: sshHost.password]
sshHost = null
sh("""
set +x
sshpass -p "${host.password}" scp -o StrictHostKeyChecking=no ${host.user}#${host.host}:filename.extension .
set -x
""")
This copies the file to your local work directory.
Probably not the best code ever, but I'm not a Groovy specialist. It works, and that is enough for me. (The set +x is there to avoid echoing the command, and with it the password, into the log.) Getting rid of anything non-CPS (sshHost = null) before you perform a CPS call saves you a lot of headaches :)
Since it took me quite a while to figure out I wanted to share this for whoever comes next.
I have written scripts for Windows and Linux to set up a new user's workspace with all the git repositories from our server.
I would like the user to enter the password for our server once, store it in a local variable, pass that variable to each git pull command, then erase the password variable and exit.
How can I supply the password when the git pull command requests it? I need this both for a Windows batch file and for a Linux shell script.
Here is code from the Linux script:
#!/bin/bash
echo "Enter password: "
read pswd
clear # no screen peeking

# This is repeated for each repo
location=folderName
mkdir $location
cd $location
git init
git remote add origin git@<server>:$location.git
git pull origin master
# The pull above prompts for a password & is where I want to automatically input $pswd
I've tried various things recommended on SO and elsewhere, such as piping, reading from a .txt file, etc. I would prefer not to need anything beyond plain old Windows cmd and Linux terminal commands. And as this script is just for setup purposes, I do not need to securely store the password permanently with something like ssh-agent.
I'm running Windows 7 and Ubuntu 12.10, but this script is meant for setting up new users, so it should ideally work on most distributions.
Synopsis:
git pull "https://<username>:<password>#github.com/<github_account>/<repository_name>.git" <branch_name>
Example:
git pull "https://admin:12345#github.com/Jet/myProject.git" master
Note: this works for me in a bash script.
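Tying this back to the asker's script, assuming the $pswd variable read earlier (if the password contains characters like @ or :, it would need URL-encoding first):
git pull "https://admin:${pswd}@github.com/Jet/myProject.git" master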
I would really recommend not trying to manage that password step yourself, and delegating it (both on Linux and Windows) to a git credential helper.
See:
"Git http - securely remember credentials"
"How to use git with gnome-keyring integration"
The user will enter the password only once per session.
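For example, a minimal setup using the built-in cache helper on Linux (the one-hour timeout is an arbitrary choice; on Windows, the wincred helper plays the same role):
git config --global credential.helper 'cache --timeout=3600'   # keep entered credentials in memory for an hour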
Reading the remote URL from git and then inserting the ID and password (PW) into the URL might also work.
For example, try the following:
cd ${REPOSITORY_DIR}
origin=$(git remote get-url origin)
origin_with_pass=${origin/"//"/"//${USER_ID}:${USER_PW}@"}
git pull ${origin_with_pass} master
I can successfully run a gsutil command with a Windows domain account from the command line in Windows (setting up a service account key, etc.). When I try to run the same command from a SQL Agent Job using a CmdExec task, the job hangs and doesn't complete. I can't see any logging, so I have no clue what it's waiting for. I've set up the job to run with the same proxy user that I use to run the gsutil command manually.
Any ideas how I can get this to work or how to see more logging?
Are you using standalone gsutil? Or did you get it as part of installing the Cloud SDK (gcloud)?
If the job hangs for a long time, it could be stuck retrying multiple times. To test if this is the case, you can set the num_retries option to be very small, but above 0 (e.g. 1), either in your .boto file or via the command arguments with this option:
gsutil -o 'Boto:num_retries=1' <rest of command here...>
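The equivalent setting in the .boto config file would look like this (in its [Boto] section):
[Boto]
num_retries = 1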
A second thing to note (at least for the version of gsutil that doesn't come with gcloud) is that gsutil looks for your boto config file (which specifies the credentials it should use) in your home directory by default. If you're running gsutil as a different user (maybe your SQL Agent Job runs as its own dedicated user?), it will look for a .boto file in that user's home directory. The same should apply for the gcloud version -- gcloud uses credentials based on the user executing it. You can avoid this by copying your .boto file to somewhere that the job has permission to read from, along with setting the BOTO_CONFIG environment variable to that path before running gsutil. From the cmd shell, this would look something like:
set BOTO_CONFIG=C:\some\path\.boto && gsutil <rest of command here...>
Note: If you're not sure which boto config file you're normally using, you can find out by running gsutil version -l and looking at the line that displays your config path(s).
I need to have an additional instance alongside our production server.
Is it possible?
Where do I begin?
Using PostgreSQL 9.1 on Windows Server.
If you already have the binaries, then adding a second instance ("cluster") is done by running initdb and then registering that new instance as a Windows service.
(I will not prefix the name of the executables with the path they are stored in. You need to either add the bin directory of the Postgres installation to your system wide PATH, use fully qualified names, or simply change into the bin directory to make it the current directory)
To do that, open a command line (cmd.exe) and use initdb to create the instance:
initdb -D c:\Data\PostgresInstance2 -W -A md5
-W makes initdb prompt you for the password to be used for the superuser of that instance - make sure you remember the username and password you have given. -D specifies where the cluster should be created. Do NOT create it under c:\Program Files.
Once the instance (cluster) is initialized edit c:\Data\PostgresInstance2\postgresql.conf to use a different port, e.g. port = 5433. If the instance should be reachable from the outside you also need to adjust listen_addresses.
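The relevant lines in postgresql.conf would look something like this ('*' listens on all interfaces; list specific addresses if you prefer):
port = 5433
listen_addresses = '*'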
You can check if everything works by manually starting the new instance:
pg_ctl start -D c:\Data\PostgresInstance2
Once you have changed the port (and adjusted other configuration parameters) you can create a Windows service for the new cluster:
pg_ctl register -N postgres2 -D c:\Data\PostgresInstance2
The service will execute with the "Local Network Account", so you have to make sure the privileges on the data directory are set up properly.
@NewSheriff: your start command for your second server needs to use the port you specified in the config, e.g. if using port 5433 instead of port 5432, then adding:
-o "-p 5433"
to the end of your start-up command should get past the error message you mentioned.
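Putting that together with the manual start command from above, a sketch of the full invocation:
pg_ctl start -D c:\Data\PostgresInstance2 -o "-p 5433"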
I would like to create a C program that periodically sends a .xml file to an FTP server for which I have been given the username and password. What would be the easiest way to do this?
One idea I had was to just build strings containing the instructions and execute them using system("command"); however, I have not used FTP before, so I do not know the correct commands. Is there a better way to go about this? Or, if this way is valid, what commands would I use to send the file via FTP?
ftp -u ftp://user:pass@ftp.ftpserver.com/local-file.txt local-file.txt
Probably lftp is a better choice here.
lftp ftp://user:password#host -e "put local-file.name; bye"
Rather than a C program, a shell script would be easier for you to manage and get things done in this case.
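As a rough sketch of that shell-script route (host, credentials, and file name are placeholders), which a C program could still invoke via system(), or cron could run periodically:
#!/bin/bash
# upload-report.sh - push the generated XML to the FTP server
FILE=report.xml
lftp ftp://user:password@ftp.example.com -e "put $FILE; bye"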
Or you can use 'scp' to send the file, like this:
$ scp <your file> <username>@<ftp-server-ipaddress>:/path/ftp-server/dir
Or 'sftp', which has the added advantage of letting you browse the ftp server:
$ sftp username@<ftp-server-ipaddress>
password:
$ ls
$ pwd
You can use simple Linux commands within the sftp session; in case you need to run commands against your local host, you add 'l' as a prefix:
$ lls
$ lpwd   # working on your machine within the sftp login
$ put <your .xml file>   # puts your file onto the ftp server
$ get <anyfile>   # gets files from the ftp server
I'm trying to download a file from sftp site using batch script. I'm getting the following error:
Permission denied (publickey,password,keyboard-interactive).
Couldn't read packet: Connection reset by peer
When running the command:
sftp -b /home/batchfile.sftp <user>@<server ip>:<folder>
The batchfile.sftp contains:
password
lcd [local folder]
cd [sftp server folder]
get *
bye
Note: it works when run at the prompt as
sftp <user>@<server ip>:<folder>
But I need the ability to enter the password automatically.
You'll want to install the sshpass program. Then:
sshpass -p YOUR_PASSWORD sftp -oBatchMode=no -b YOUR_COMMAND_FILE_PATH USER#HOST
Obviously, it's better to set up public key authentication. Only use this if that's impossible to do, for whatever reason.
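A sketch of that public-key setup on the client (USER, HOST, and the command file path are placeholders):
ssh-keygen -t ed25519                        # generate a keypair; accept the default path
ssh-copy-id USER@HOST                        # install the public key on the server (asks for the password once)
sftp -b YOUR_COMMAND_FILE_PATH USER@HOST     # subsequent runs need no password prompt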
If you are generating a heap of commands to be run and then calling that script from a terminal, you can try the following.
sftp login#host < /path/to/command/list
You will then be asked to enter your password (as per normal); however, all the commands in the script run after that.
This is clearly not a completely automated option that can be used in a cron job, but it can be used from a terminal.
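For reference, the command list here would be the same sort of file as the asker's batchfile.sftp, just without the password line (which sftp never reads as a password anyway):
lcd [local folder]
cd [sftp server folder]
get *
bye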
I advise you to run sftp with -v option. It becomes much easier to fathom what is happening.
The manual clearly states:
The final usage format allows for automated sessions using the -b option. In such cases, it is necessary to configure non-interactive authentication to obviate the need to enter a password at connection time (see sshd(8) and ssh-keygen(1) for details).
In other words, you have to establish public-key authentication. Then you'll be able to run a batch script.
P.S.
It is wrong to put your password in your batch file.
You mention batch files; am I correct, then, in assuming that you're talking about a Windows system? If so, you cannot use sshpass, and you will have to switch to a different option.
Two such options, which follow diametrically opposite philosophies, are:
psftp: a command-line tool that you can call from within your batch scripts; psftp is part of the PuTTY package and you can find it here: http://www.chiark.greenend.org.uk/~sgtatham/putty/download.html
Syncplify.me FTP Script: a scriptable FTP/S and SFTP client for Windows that allows you to store your password in encrypted "profile files"; check it out here: http://www.syncplify.me/products/ftp-script/
Either way, switching from password to PKI authentication is strongly recommended.
psftp -b path/file_name.sftp user@IP_server -hostkey 1e:52:b1... -pw password
The file content is:
lcd "path/of/files/to/send"
cd path_destination
mput file_name_to_send
quit
To obtain the hostkey, run:
psftp user@IP_SERVER
You need to use the pscp command and force it to use the SFTP protocol. pscp is installed automatically when you install PuTTY, software for connecting to a Linux server through SSH.
Once you have pscp available, here is the command line:
pscp -sftp -pw <yourPassword> "<pathToYourFile(s)>" <username>#<serverIP>:<PathInTheServerFromTheHomeDirectory>
These parameters (-sftp and -pw) are only available with pscp, not scp. You can also add -r if you want to upload everything in a folder recursively.
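For example, a recursive upload of a whole folder might look like this (placeholders as above):
pscp -sftp -r -pw <yourPassword> "<pathToYourFolder>" <username>@<serverIP>:<PathInTheServerFromTheHomeDirectory>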
This command should help you:
sshpass -p MYPASSWORD sftp MYUSERNAME@HOST