ls not working well with nohup

I want to count the number of files in a directory from a shell script.
This command works well:
let number_of_files=`ls $direc -l | wc -l`
My problem is that when I run the script with nohup, it doesn't work well.
The same happens when I try to pick out a single file name:
file_name=`ls -1 $direc | head -$file_number | tail -1`
Do you know any other option to do it?
I know that in C there is a function for this:
num_of_files = scandir(directory, &namelist, NULL, NULL);
For reference, here is the full command line I use:
nohup sh script_name.sh > log.txt &
Do you know of another way to do this in a shell script that works well with nohup?
Thanks.

Try something like this:
NUMBER_OF_FILES=$(find . -maxdepth 1 -type f | wc -l)
echo $NUMBER_OF_FILES
That is: run find from the current directory to a maximum depth of 1 (i.e. the current directory only), match everything of type "file", and then count the number of lines. Finally, assign the result to NUMBER_OF_FILES.
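If you'd rather avoid external commands entirely, a glob expanded into an array also gives you the count, and it behaves the same whether or not the script runs under nohup. A minimal sketch, assuming bash; $direc stands in for the directory variable from the question:
#!/bin/bash
direc="/some/dir"          # assumption: the directory to count
shopt -s nullglob          # an empty directory yields an empty array, not a literal '*'
entries=( "$direc"/* )     # counts all non-hidden entries, files and directories alike
number_of_files=${#entries[@]}
echo "$number_of_files"
Because nothing here parses ls output or touches a terminal, there is nothing for nohup's redirection to disturb.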

Related

How to run a command on all .cs files in directory and store file path as a variable to be used as command on windows

I'm trying to run the following command on each file of a directory.
svn blame FILEPATH | gawk '{print $2}' | sort | uniq -c
It works well; however, it only works on individual files. For whatever reason, it won't run on the directory as a whole. I was hoping to create some form of batch script that would iterate through the directory, grab each file path, and store it in a variable to be used in the command. However, I've never written a batch script, nor do I know the first thing about them. I tried this loop, but couldn't get it to work:
set codedirectory=%C:\Repo\Pineapple% for %codedirectory% %%i in (*.cs) do
but I'm not necessarily sure what to do next. Unfortunately, this all has to be run on Windows. Any help would be greatly appreciated. Thanks!
Use for and find, similar to the example at
https://tldp.org/HOWTO/Bash-Prog-Intro-HOWTO-7.html
for i in $(find . -name "*.cs"); do
    svn blame "$i" | gawk '{print $2}' | sort | uniq -c
done
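One caveat: the for loop above splits find's output on whitespace, so it trips over paths containing spaces. A more robust variant (a sketch, under the same assumptions) lets read consume NUL-delimited names instead:
find . -name "*.cs" -print0 | while IFS= read -r -d '' f; do
    svn blame "$f" | gawk '{print $2}' | sort | uniq -c
done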

change folder modified date based on most recent file modified date in folder

I have a number of project folders that all got their modified dates set to the current date & time somehow, despite my not having touched anything in them. I'm looking for a batch applet or some other utility that will let me drop a folder (or folders) on it and have its modified date set to that of the most recently modified file inside. Can anyone please tell me how I can do this?
In case it matters, I'm on OS X Mavericks 10.9.5. Thanks!
If you start a Terminal and use stat, you can get the modification times of all the files and their corresponding names, separated by a colon, as follows:
stat -f "%m:%N" *
Sample Output
1476985161:1.png
1476985168:2.png
1476985178:3.png
1476985188:4.png
...
1476728459:Alpha.png
1476728459:AlphaEdges.png
You can now sort that, take the first line, and remove the timestamp, leaving the name of the newest file:
stat -f "%m:%N" *png | sort -rn | head -1 | cut -f2 -d:
Sample Output
result.png
Now you can put that in a variable and use touch to set the modification times of all the other files to match it:
newest=$(stat -f "%m:%N" *png | sort -rn | head -1 | cut -f2 -d:)
touch -r "$newest" *
So, if you wanted to be able to do that for any given directory name, you could make a little script in your HOME directory called setMod like this:
#!/bin/bash
# Check that exactly one parameter has been specified - the directory
if [ $# -eq 1 ]; then
    # Go to that directory or give up and die
    cd "$1" || exit 1
    # Get name of newest file
    newest=$(stat -f "%m:%N" * | sort -rn | head -1 | cut -f2 -d:)
    # Set modification times of all other files to match
    touch -r "$newest" *
fi
Then make it executable (only necessary once) with:
chmod +x $HOME/setMod
Now, you can set the modification times of all files in /tmp/freddyFrog like this:
$HOME/setMod /tmp/freddyFrog
Or, if you prefer, you can call it from AppleScript with:
do shell script "$HOME/setMod " & nameOfDirectory
The nameOfDirectory will need to look Unix-y (like /Users/mark/tmp) rather than Apple-y (like Macintosh HD:Users:mark:tmp).
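One thing to note: the script above stamps the files inside the folder. If you want the folder's own modified date set to that of its newest file, which is what the question asks for, a small variation (a sketch) touches the directory itself:
#!/bin/bash
# Sketch: set a directory's own mtime from its newest file's mtime
cd "$1" || exit 1
newest=$(stat -f "%m:%N" * | sort -rn | head -1 | cut -f2 -d:)
touch -r "$newest" .    # '.' is the directory itself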

bash array count always returns 1

I searched all over for this, but the terms are apparently too general. I'm writing a script to search a group of folders for .mp3 files. Some folders don't have mp3's so they have to be excluded.
I created an array to hold the uniq'd folder names. This find command will get the folders I need.
Folders=$(sudo find /my/music/ -type f -name "*.mp3" | cut -d'/' -f7 | sort -u)
When I try to count the number of folders in the array, I always get 1
echo ${#Folders[@]}
echo ${Folders[@]} prints them out on separate lines, so I thought they were separate array elements. Can anyone explain what is going on? You might have to jiggle the field number in the cut command to reproduce locally.
Folders is not an array but a plain string variable.
You need:
Folders=( $(sudo find /my/music/ -type f -name "*.mp3" | cut -d'/' -f7 | sort -u) )
i.e. enclose the command substitution in (). Now ${#Folders[@]} would give you the number of elements of the array Folders.
Or do:
sudo find /my/music/ -type f -name "*.mp3" | cut -d'/' -f7 | sort -u | wc -l
Note
wc -l prints the number of lines, which in this case is the number of unique folder names.
To make things a bit more explicit, use the -printf "%p\n" option with find, where the %p specifier prints each file with its full path.
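For instance (note that -printf is a GNU find extension, so this assumes GNU findutils rather than the BSD find that ships with macOS):
sudo find /my/music/ -type f -name "*.mp3" -printf "%p\n" | cut -d'/' -f7 | sort -u | wc -l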
Assuming bash 4 or later, don't use find here; use the globstar operator.
shopt -s globstar
folders=( /my/music/**/*.mp3 )
Also assuming that cut -d/ -f7 is supposed to extract the filename alone, follow this up with
folders=( "${folders[@]##*/}" )
Other methods for populating the array must take more care to accommodate files containing whitespace or characters like ?, *, or [. File names containing newlines (rare, but not illegal) are much more difficult to handle correctly. Pathname expansion is done inside the shell, so you don't need to worry about any such special characters.
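Putting those pieces together, here is a sketch of the whole count using only globs (assuming bash 4+, and assuming the goal is the set of unique directories that directly contain .mp3 files):
#!/bin/bash
shopt -s globstar nullglob
mp3s=( /my/music/**/*.mp3 )
# Keep each file's containing directory, then de-duplicate
readarray -t folders < <(printf '%s\n' "${mp3s[@]%/*}" | sort -u)
echo "${#folders[@]}"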

passing values of text file to array in shell scripting

My script fetches the names of directories in a path and stores them in a text file.
#!/bin/bash
MYDIR="/bamboo/artifacts"
DIRS=`ls -d /bamboo/artifacts/* | cut -d'/' -f4 > plan_list.txt`
plan_list.txt:
**************
PLAN1
PLAN2
PLAN3
Now I am trying to append each of these directory names to a URL to get output like this:
http://bamboo1.test.com:8080/browse/PLAN1
http://bamboo1.test.com:8080/browse/PLAN2
http://bamboo1.test.com:8080/browse/PLAN3
The script to do that doesn't seem to work
bambooServer="http://bamboo1.test.com:8080/browse/"
for DIR in $DIRS
do
echo `$bambooServer+$DIR`
done
Could someone please tell me what I am missing here? Instead of storing the ls output in plan_list.txt, I also tried passing it to an array, but that didn't work well either.
DIRS=`ls -d /bamboo/artifacts/* | cut -d'/' -f4 > plan_list.txt`
DIRS is just an empty variable, since your command produces no output on stdout; it redirects everything to plan_list.txt.
You can rewrite your script like this:
#!/bin/bash
mydir="/bamboo/artifacts"
cd "$mydir"
bambooServer="http://bamboo1.test.com:8080/browse/"
for dir in */
do
echo "$bambooServer$dir"
done
*/ is the glob pattern to get all the directories in your current path.
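If you do want the names in an array, as the title asks, rather than looping over a glob, here is a sketch using readarray (bash 4+), assuming plan_list.txt already holds one name per line:
#!/bin/bash
bambooServer="http://bamboo1.test.com:8080/browse/"
readarray -t plans < plan_list.txt     # one array element per line
for plan in "${plans[@]}"; do
    echo "${bambooServer}${plan}"
done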

Move files containing X but not containing Y

To manage my backup sync folder, I am trying to come up with a command that would move files beginning with string1* but NOT ending with *string2 from /folder1 to /folder2.
What would a command combining two such opposite conditions (HAS and HAS NOT) look like?
#!/bin/bash
for i in `ls -d /folder1/string1* | grep -v 'string2$'`
do
    # Test that we have a regular file and not a directory etc.
    if [ -f "$i" ]; then
        mv "$i" /folder2
    fi
done
Try something like
find /folder1 -mindepth 1 -maxdepth 1 -type f \
    -name 'string1*' \! -name '*string2' -exec cp -iv -t /folder2 {} +
Note: with +, the {} must come last, which here relies on GNU cp's -t option; if you have an older find (or a non-GNU cp), use -exec cp -iv {} /folder2 \; instead.
To me this is another case for what I shall call the read-while pattern.
cd /folder1
ls string1* | grep -v 'string2$' | while read -r f; do mv "$f" /folder2; done
The other answers are good alternatives, and find in particular can do a lot. But I always get a headache using find, and never use it quite often enough to manage without the manpage open.
Also, starting with ls or a simple find to get a list of files, then using any or all of sed, awk, grep, or whatever you have to hand to adjust/trim/extend that list, and then bunging it into a loop, is a crude(ish) but pretty powerful technique.
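For completeness, bash's extended globbing can express the HAS and HAS-NOT pair in a single pattern, with no ls, grep, or find at all (a sketch, assuming bash with extglob available):
shopt -s extglob nullglob
for f in /folder1/string1!(*string2); do
    [ -f "$f" ] && mv "$f" /folder2    # regular files only
done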
