Handle Ctrl+D programmatically?

I am trying to execute the following Perl script:
##some code
$command = "nail -s this is a test $email";
system($command);
##some code
When I run this script, it hangs until I press Ctrl+D. After pressing Ctrl+D I get the desired result. My question is: how can I hardcode Ctrl+D in my script?

I suppose you are calling mailx; nail is most likely an alias. It expects input from STDIN, which is terminated with Ctrl+D. You could work around it like this to send an empty mail:
$command = 'echo "" | nail -s SUBJECT ' . $email;
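The pipe fixes the hang because the pipe's write end closes as soon as echo exits, which delivers EOF to the mail program's stdin. A minimal sketch of the same effect, with cat standing in for nail (both read stdin until EOF):

```shell
#!/bin/sh
# Without any redirection, cat would sit waiting for Ctrl+D from the
# terminal. Piping an empty string closes its stdin immediately.
body=$(echo "" | cat)     # returns at once: the pipe supplies EOF
printf 'body is empty: %s\n' "$([ -z "$body" ] && echo yes || echo no)"
```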

The mail program expects a . on a line by itself to mark the end of the message.
Just make sure your $email contains a \n. and it should no longer hang.

The usual solution is to redirect stdin to read from /dev/null.
Try this:
mail -s "Hello Test" -a Attachment email-address </dev/null
or, if you have any email body
mail -s "Hello Test" -a Attachment email-address <emailbodyfile.txt


Why does quoting password-variable in curl lead to authorization failure? (Bash)

I have a very specific problem with bash and curl.
What we do is:
read a password from Jenkins and paste it into a config file (I don't have access to the password)
read the parameters from the config file in bash (host, user, password, etc.) and store them in variables
post something with curl to a database and store the result in a variable
Recently we added shellcheck to our deploy scripts, and therefore we need to put the variables in quotes.
That's the request we want to send (shellcheck-approved):
result=$(curl -s -XPOST "${dbURL}" --header "Authorization: Basic $(echo -n "${dbUser}:${dbPwd}" | base64)" --data-binary "blabla")
And here's the error message we get in return:
{"error":"authorization failed"}
It does work when I unquote the password variable ("${dbUser}":${dbPwd}), but then shellcheck complains that I need to put all variables in quotes. It also works on another machine with a different password (which I have no access to either).
It is the same when I use --user username:password, so the problem seems to lie within the password.
Googling, and testing the procedure (without curl) with different special characters, couldn't solve it either.
Has anyone experienced something like this?
Edit1:
This is an extract from the Jenkins deploy file:
stage('config files') {
withCredentials([string(credentialsId: "${env_params.db_password}", variable: 'db_pw')]) {
sshagent(credentials: ["${env_params.user}"]) {
sh "echo \"dbPwd=${db_pw}\" >> environment_variables/config.properties"
This is how the shell script stores the password:
dbPwd=$(grep ^"$dbPwd" <PATH>/config.properties | cut -d "=" -f2)
Thanks for your support.
It turned out there was trailing whitespace in the stored password.
I removed it using sed and now it works:
dbPwd=$(grep ^"$dbPwd" <PATH>/config.properties | cut -d "=" -f2 | sed -e 's/[[:space:]]*$//')
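The failure makes sense: with the variable quoted, the trailing whitespace survives into the Authorization header and changes the base64 token; unquoted, word splitting happened to discard it. A minimal sketch with hypothetical credentials, showing that the raw and cleaned tokens differ:

```shell
#!/bin/sh
user="dbadmin"                   # hypothetical user
pwd_raw="s3cret   "              # password as read from the file, with trailing spaces
pwd_clean=$(printf '%s' "$pwd_raw" | sed -e 's/[[:space:]]*$//')

# Same base64 construction as in the curl command above.
tok_raw=$(printf '%s:%s' "$user" "$pwd_raw" | base64)
tok_clean=$(printf '%s:%s' "$user" "$pwd_clean" | base64)

echo "raw:   $tok_raw"
echo "clean: $tok_clean"   # differs from tok_raw, so the server rejects the raw one
```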
You can just set the password in another file and use the content of the file as your password variable.

Populate array to ssh in bash

Just some background: I have a file with 1000 servers in it, newline-delimited. I have to read them into an array, then run about 5 commands over SSH. I have been using heredoc notation, but that seems to fail. Currently I get an error saying the host isn't recognized.
IFS='\n' read -d '' -r -a my_arr < file
my_arr=()
for i in "${my_arr[@]}"; do
ssh "$1" bash -s << "EOF"
echo "making back up of some file"
cp /path/to/file /path/to/file.bak
exit
EOF
done
I get output that lists the first server, but then all the other ones in the array as well. I know I am missing a redirect for STDIN that causes this.
Thanks for the help.
Do you need an array? What is wrong with:
while read -r host
do
ssh "$host" bash -s << "EOF"
echo "making back up of some file"
cp /path/to/file /path/to/file.bak
EOF
done < file
To be clear -- the problem here, and the only problem present in the code actually included in your question, is that you're using $1 inside your loop, whereas you specified $i as the variable that contains the entry being iterated over on each iteration of the loop.
That is to say: ssh "$1" needs to instead be ssh "$i".
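If you do want to keep the array approach, a sketch of the corrected loop, with echo standing in for the real ssh invocation (the host file path is made up for the example):

```shell
#!/bin/bash
# Build a sample host file; in the real script this already exists.
printf 'host1\nhost2\nhost3\n' > /tmp/hosts.$$

my_arr=()
while IFS= read -r line; do
    my_arr+=("$line")       # one host per array element
done < /tmp/hosts.$$

for i in "${my_arr[@]}"; do
    # Real version: ssh "$i" bash -s <<'EOF' ... EOF
    echo "would ssh to: $i"
done
rm -f /tmp/hosts.$$
```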

Mailx from within a C program?

I'm hoping to send a message via mailx to a list of recipients from within my C code.
I want to send an email containing the contents of the 'message' variable to everyone in the file /home/me/Email_List.txt.
if(send_email)
{
char* message = "Testing email";
//send contents of 'message' to everyone in /home/me/Email_List.txt
}
I need help with both the C program and the mailx command. Here's my mailx command that doesn't quite work:
# This works, but I don't want to send the contents of Email_List.txt
cat /home/me/Email_List.txt /home/me/Email_List.txt | mailx -t -s "Test"
# This doesn't work; error:
# cat: cannot open Test Text
cat /home/me/Email_List.txt "Test Text" | mailx -t -s "Test"
I could write my text to a file before sending it, but that seems inefficient.
Thoughts?
Working at the command line, I got this to work for me (with my normal corporate email address in place of me@example.com, of course):
mailx -s "Just a test" -t <<EOF
To: me@example.com
Subject: Just a test with a subject
Just testing mailx -t which seems to ignore -s options too
-=JL=-
EOF
The -s option subject line was ignored, as hinted in the body text. (This is a mailx version 12.5 6/20/10 on a machine running a derivative of Ubuntu 12.04 LTS.)
Note that the To: line is case-sensitive and space-sensitive (at least there must be a space after the colon). This is standard notation for the headers defined by RFC-822 (or whatever its current incarnation is). When I tried with the -s option but no Subject: line, I got a message without a subject in my inbox.
The following should work for you:
$ cat /home/me/Email_list.txt
To: My.email@company.com
Subject: Test email
$ { cat /home/me/Email_List.txt; echo "Test Text"; } | mailx -t
$
Note the blank line! Or you could use a plain echo; before the echo "Test Text";. The semicolons are needed in the { ... } notation.
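The { ...; } group concatenates the header file and the echoed body into a single stream on stdout, which is what mailx -t consumes. A sketch of just the stream-building part, with cat in place of mailx (header contents are hypothetical):

```shell
#!/bin/sh
# Hypothetical header file, as in the answer above.
printf 'To: me@example.com\nSubject: Test email\n' > /tmp/hdrs.$$

# The group's combined stdout is one stream; the blank echo separates
# the headers from the body, as the RFC-822 message format requires.
{ cat /tmp/hdrs.$$; echo; echo "Test Text"; } | cat
rm -f /tmp/hdrs.$$
```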

Nagios bash script returns no output when executed through check_nrpe

My nagios bash script works fine from the client's command line.
When I execute the same script through check_nrpe from the nagios server it returns the following message "CHECK_NRPE: No output returned from daemon."
Seems like a command in the bash script is not being executed.
arrVars=(`/usr/bin/ipmitool sensor | grep "<System sensor>"`)
#echo "Hello World!!"
myOPString=""
<Process array and determine string to echo along with exit code>
echo $myOPString
if [[ $flag == "False" ]]; then
exit 1
else
exit 0
fi
"Hello World" shows up on the nagios monitoring screen if I uncomment the echo statement.
I am new to Linux, but it seems like the nagios user isn't able to execute ipmitool.
arrVars=(`/usr/bin/ipmitool sensor | grep "<System sensor>"`)
Check the output of the above; you can echo it and check the values. If it still does not work, have this script call another script to get the output and assign it to a variable.
exit 1
This refers to the severity, so you would have to define different conditions under which the severity changes.
Add this line to the sudoers
nagios ALL=(root) NOPASSWD: /usr/bin/ipmitool
Then use "sudo /usr/bin/ipmitool" in your script

Alternative to Cat out of a bash script?

I have an issue with the following:
exec 3<>/dev/tcp/$1/37491
echo -ne 060e2b3 00$hexdec$cmdhex | perl -pe 's/([0-9a-f]{2})/chr hex $1/gie' >&3
cat <&3
I have a server to which I send a hex string over the TCP connection made above. But since I am using PuTTY through a terminal, my first issue is that the XML response always says PuTTY and it doesn't escape. I also sometimes need to put the replies into an array, and I have tried:
array=`cat <&3`
echo "Array items:"
for item in ${array[*]}
do
printf " %s\n" $item
done
I believe that since the cat is not exiting properly, it just stays open and the array is never completed.
Thanks in advance for the help.
Do you need to specify a timeout?
while read -t 5 line <&3; do
echo "$line"
(( Lines++ ))
done
exec 3>&-
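The same read loop also collects the replies into an array; the -t timeout only matters because a TCP peer may never send EOF. A sketch using a regular file as a stand-in for the socket descriptor (where EOF ends the loop naturally):

```shell
#!/bin/bash
printf 'reply1\nreply2\n' > /tmp/replies.$$
exec 3< /tmp/replies.$$    # stand-in for the TCP fd opened with exec 3<>...

lines=()
while read -t 5 -r line <&3; do
    lines+=("$line")       # each reply becomes one array element
done
exec 3<&-                  # close the descriptor when done

echo "collected ${#lines[@]} lines"
rm -f /tmp/replies.$$
```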
Perhaps you need to close the TCP/IP socket?
exec 3>&-
