I have a file which contains commands similar to:
cat /home/ptay89/test/01.out
cat /home/ptay89/testing/02.out
...
But I only want a few of them executing. For example, if I only want to see the output files ending in 1.out, I can do this:
cat commands | grep 1.out | sh
However, I get the following output for each of the lines in the commands file:
: cannot be loaded - no such file or directoryst/01.out
When I copy and paste the commands I want from the file directly, it works fine. Are there better ways of doing this?
You probably have spurious carriage returns in your file (created under Windows?). Use tr instead of cat to remove them:
tr -d '\015' <commands | grep 1.out | sh
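If you want to confirm that carriage returns really are the culprit first, a quick check (a sketch, assuming GNU tools) is:
cat -v commands | head   # DOS line endings show up as ^M at the end of each line
GNU sed can also strip them in place, so the file only has to be fixed once:
sed -i 's/\r$//' commands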
Try doing a
grep -e '^cat.*out' commands | grep 1.out | sh
That should ignore any weird characters and take only the ones you need.
Related
I'm trying to run the following command on each file of a directory.
svn blame FILEPATH | gawk '{print $2}' | sort | uniq -c
It works well; however, it only works on individual files. For whatever reason, it won't run on the directory as a whole. I was hoping to create some form of batch script that would iterate through the directory, grab each file path, and store it in a variable to be used in the command. However, I've never written a batch script, nor do I know the first thing about them. I tried this loop but couldn't get it to work:
set codedirectory=%C:\Repo\Pineapple% for %codedirectory% %%i in (*.cs) do
but I'm not sure what to do next. Unfortunately, this all has to be run on Windows. Any help would be greatly appreciated. Thanks!
Use for and find, similar to the example at
https://tldp.org/HOWTO/Bash-Prog-Intro-HOWTO-7.html
for i in $(find . -name "*.cs"); do
svn blame "$i" | gawk '{print $2}' | sort | uniq -c
done
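Note that the unquoted $(find ...) splits on whitespace, so this breaks on file names containing spaces. A more robust sketch (bash-specific, assuming GNU or BSD find for -print0):
find . -name '*.cs' -print0 | while IFS= read -r -d '' f; do
    # -d '' makes read split on NUL bytes, which cannot appear in file names
    svn blame "$f" | gawk '{print $2}' | sort | uniq -c
done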
I tried to make a bash script that can find a "keyword" inside *.desktop files. My approach is to set some keywords in an array, then pass them to grep. It works flawlessly until a keyword has at least two words separated by a space.
What it should be:
cat /usr/share/applications/*.desktop | grep -i "Mail Reader"
What I have tried:
search=$(printf 'Name=%s' "${appsx[$index]}")
echo \""$search\"" #debug
cat /usr/share/applications/*.desktop | grep -i $search
search=$(printf 'Name=%s' "${appsx[$index]}")
echo \""$search\"" #debug
cat /usr/share/applications/*.desktop | grep -i \""$search\""
search=$(printf '"Name=%s"' "${appsx[$index]}")
echo $search #debug
cat /usr/share/applications/*.desktop | grep -i $search
Any suggestions are highly appreciated.
If you simply assign Mail Reader to the variable search like below
search=Mail Reader
bash would complain that the command Reader is not found, because it treats everything after the first blank character as a command to run, with search=Mail as a temporary assignment for it. What you need is
search="Mail Reader" # 'Mail Reader' would also do.
In the case of your command substitution, things are no different; you need double-quote wrappers though, as the substitution itself would not happen inside single quotes:
search="$(command)"
In your case, the command substitution is overkill anyway. It can be simplified to:
search="Name=${appsx[$index]}"
# Then do the grep.
# Note that cat-grep combo could be simplified to
# -h suppresses printing filenames to get same result as cat .. | grep
grep -ih "$search" /usr/share/applications/*.desktop
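Putting it together with your array (a sketch; appsx is the array from your question):
for index in "${!appsx[@]}"; do
    search="Name=${appsx[$index]}"
    # quoting $search keeps multi-word keywords like "Mail Reader" in one argument
    grep -ih "$search" /usr/share/applications/*.desktop
done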
My script fetches the names of directories in a path and stores them in a text file.
#!/bin/bash
MYDIR="/bamboo/artifacts"
DIRS=`ls -d /bamboo/artifacts/* | cut -d'/' -f4 > plan_list.txt`
plan_list.txt:
**************
PLAN1
PLAN2
PLAN3
Now I am trying to append each of these directory names to a URL to get output like this:
http://bamboo1.test.com:8080/browse/PLAN1
http://bamboo1.test.com:8080/browse/PLAN2
http://bamboo1.test.com:8080/browse/PLAN3
The script to do that doesn't seem to work
bambooServer="http://bamboo1.test.com:8080/browse/"
for DIR in $DIRS
do
echo `$bambooServer+$DIR`
done
Could someone please tell me what I am missing here? Instead of storing the ls output in plan_list.txt, I tried passing it to an array, but that didn't work well either.
DIRS=`ls -d /bamboo/artifacts/* | cut -d'/' -f4 > plan_list.txt`
DIRS is just an empty variable, since your command produces no output on stdout; everything is redirected to plan_list.txt.
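You can see this by checking the variable right after the assignment (a quick demonstration):
DIRS=`ls -d /bamboo/artifacts/* | cut -d'/' -f4 > plan_list.txt`
echo "DIRS='$DIRS'"   # prints DIRS='' because all the output went to the file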
You can rewrite your script like this:
#!/bin/bash
mydir="/bamboo/artifacts"
cd "$mydir"
bambooServer="http://bamboo1.test.com:8080/browse/"
for dir in */
do
echo "$bambooServer$dir"
done
*/ is the glob pattern that matches all the directories in your current path. Each match keeps its trailing slash, which ${dir%/} strips off so the URLs come out as http://bamboo1.test.com:8080/browse/PLAN1 rather than .../PLAN1/.
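If you would rather have the plan names in an array (as you attempted with DIRS), a bash sketch along the same lines:
plans=()
for dir in /bamboo/artifacts/*/; do
    dir="${dir%/}"            # drop the trailing slash
    plans+=( "${dir##*/}" )   # keep only the last path component, e.g. PLAN1
done
printf '%s\n' "${plans[@]}"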
I am trying to use xmllint to search an xml file and store the values I need into an array. Here is what I am doing:
#!/bin/sh
function getProfilePaths {
unset profilePaths
unset profilePathsArr
profilePaths=$(echo 'cat //profiles/profile/@path' | xmllint --shell file.xml | grep '=' | grep -v ">" | cut -f 2 -d "=" | tr -d \")
profilePathsArr+=( $(echo $profilePaths))
return 0
}
In another function I have:
function useProfilePaths {
getProfilePaths
for i in ${profilePathsArr[@]}; do
echo $i
done
return 0
}
useProfilePaths
The behavior of the function changes depending on whether I run the commands manually on the command line or call them from another function as part of a wrapper script. When I call my function from the wrapper script, the array contains 1 item, compared to 2 when I do it from the command line:
$ echo ${#profilePathsArr[@]}
2
The content of profilePaths looks like this when echoed:
$ echo ${profilePaths}
/Profile/Path/1 /Profile/Path/2
I am not sure what the separator is for an xmllint call.
When I call my function from my wrapper script, the content of the first iteration of the for loop looks like this:
for i in ${profilePathsArr[@]}; do
echo $i
done
the first echo looks like:
/Profile/Path/1
/Profile/Path/2
... and the second echo is empty.
Can anyone help me debug this issue? If I could find out what is the separator used by xmllint, maybe I could parse the items correctly in the array.
FYI, I have already tried the following approach, with the same result:
profilePaths=($(echo 'cat //profiles/profile/@path' | xmllint --shell file.xml | grep '=' | grep -v ">" | cut -f 2 -d "=" | tr -d \"))
Instead of using the --shell switch and many pipes, you should use the proper --xpath switch.
But as far as I know, when you have multiple values, there's no simple way to split the different nodes.
So a solution is to iterate like this:
profilePaths=(
$(
for i in {1..100}; do
xmllint --xpath "//profiles/profile[$i]/@path" file.xml || break
done
)
)
or use xmlstarlet:
profilePaths=( $(xmlstarlet sel -t -v "//profiles/profile/@path" file.xml) )
It displays the output with newlines by default.
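To fill the array without relying on word splitting at all, bash 4+ can read the newline-separated values directly (a sketch; -n makes xmlstarlet end the output with a newline):
mapfile -t profilePaths < <(xmlstarlet sel -t -v "//profiles/profile/@path" -n file.xml)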
The problem you're having is related to data encapsulation; specifically, variables defined in a function are local, so you can't access them outside that function unless you define them otherwise.
Depending on the implementation of sh you're using, you may be able to get around this by using eval on your variable definition, or with a modifier like global for mksh and declare -g for zsh and bash. I know that mksh's implementation definitely works.
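A minimal sketch of the declare -g variant for bash (declare -g needs bash 4.2+; the names mirror your functions, and the paths are just sample data):
function getProfilePaths {
    # -g creates the array in the global scope even though we are inside a function
    declare -ga profilePathsArr=( "/Profile/Path/1" "/Profile/Path/2" )
}
getProfilePaths
echo "${#profilePathsArr[@]}"   # prints 2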
Thank you for providing feedback on how to resolve this problem. After investigating further, I was able to make this work by changing the way I iterate over the content of my 'profilePaths' variable to insert its values into the 'profilePathsArr' array:
# Retrieve the profile paths from file.xml and assign to 'profilePaths'
profilePaths=$(echo 'cat //profiles/profile/@path' | xmllint --shell file.xml | grep '=' | grep -v ">" | cut -f 2 -d "=" | tr -d \")
# Insert them into the array 'profilePathsArr'
IFS=$'\n' read -rd '' -a profilePathsArr <<<"$profilePaths"
For some reason, with all the different function calls from my master script and calls to other scripts, it seemed like the separators were lost along the way. I am unable to find the root cause, but I know that by using "\n" as the IFS when reading into the array, it worked like a charm.
If anybody wishes to add more comments on this, you are more than welcome.
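A quick way to verify the split worked as expected (printing one element per line, then the count):
printf '%s\n' "${profilePathsArr[@]}"
echo "${#profilePathsArr[@]} elements"   # should report 2 for the sample data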
I'm using the command grep 3 times on the same line like this
ls -1F ./ | grep / | grep -v 0_*.* | grep -v undesired_result
Is there a way to combine them into one command instead of having to pipe it 3 times?
There's no way to do both a positive search (grep <something>) and a negative search (grep -v <something>) in one command line, but if your grep supports -E (alternatively, egrep), you could do
ls -1F ./ | grep / | grep -E -v '0_*.*|undesired_result'
to reduce the sub-process count by one. To go beyond that, you'd have to come up with a specific regular expression that matches either exactly what you want or everything you don't want.
Actually, I guess that first sentence isn't entirely true if you have egrep, but building the proper regular expression that correctly includes both the positive and negative parts and covers all possible orderings of the parts might be more frustrating than it's worth...
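If cutting down the process count is the main goal, awk can combine positive and negative matches in a single process (a sketch; the patterns are adapted from your pipeline, so adjust them to taste):
ls -1F ./ | awk '/\// && !/^0_/ && !/undesired_result/'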