Change a project's namespace - qooxdoo

When I created my project I typed its name wrong, and since I'm still learning I just started programming anyway. Now I have many classes in many files, and the project keeps growing, and I'd like to rename it. Is that possible? Let's say from ame_testqooxdooini1 to wrs.
Thanks a lot

Close your editor, open a shell, and then do this:
OLD=badname
NEW=goodname
cd source/class
mv $OLD $NEW                                     # rename the namespace directory
find $NEW -name '*.js' -print0 | \
xargs -0 perl -i~ -p -e "s/${OLD}\./${NEW}./g"   # rewrite every "OLD." reference in place, keeping ~ backups
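To double-check afterwards that nothing still refers to the old namespace, a quick grep from the project root should do (a sanity check only, not part of the rename; adjust the paths to your layout):
cd ../..                        # back to the project root
grep -rn "${OLD}\." source/ && echo "old namespace still referenced" || echo "looks clean"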

Related

Improved find command to list files, their dir and size

I'm working on a command line that I execute with plink from PowerShell (PowerCLI) on ESXi.
The idea is to list the vmdk files (with some exceptions), together with their symlink (because their real folder names are IDs) and the first subfolder (which helps me find the VMDK file, since it may reflect the VM folder). The output is in CSV format so I can easily use it in PowerShell. This is where I have come so far:
find /vmfs/volumes -type l -exec find {} -name "*.vmdk" -follow \; | awk '{n=split($0,a,"/"); print a[4]";"a[5]";"a[n] }' | grep -v ".*-flat.vmdk$" | grep -v ".*delta.vmdk$" | grep -v ".*-ctk.vmdk$"
This works for me, but I'd like to add the file size as a last field (VMDKFileName;Size). The size format does not really matter; I'll be able to manipulate it within my PS script.
I don't know whether I'm on the right track to fulfill my needs.
Do not hesitate to ask for more information.
P.S.: a one-liner command would be great; as I'm using plink, it's easier for me to use.
TIA
OK, the answer is here (after lots of headaches)!
find $(find /vmfs/volumes -type l -maxdepth 1) -name "*.vmdk" -follow -exec ls -lHd {} \; | awk '{n=split($0,a,"/"); print a[4]";"a[5]";"a[n]";"$5}' | grep -v ".*-flat.vmdk" | grep -v ".*delta.vmdk" | grep -v ".*-ctk.vmdk"
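The three trailing grep -v filters can probably be collapsed into one extended-regex filter; a hedged variant of the same pipeline (this assumes the BusyBox grep on your ESXi build supports -E, which is worth verifying first):
find $(find /vmfs/volumes -maxdepth 1 -type l) -name "*.vmdk" -follow -exec ls -lHd {} \; | awk '{n=split($0,a,"/"); print a[4]";"a[5]";"a[n]";"$5}' | grep -Ev '(-flat|delta|-ctk)\.vmdk'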

Run an awk script on every file of a certain type in a directory

I have a directory with several hundred .log files in it, and I have a script to pull some info out of them and print it to an existing file. Running it on one file goes like
awk -f HLGcheck.sh 1-1-1.log >> outputs.txt
and this works fine. I've looked around for several hours online and I can't seem to find a decent way to have it run on all .log files in the directory. Any help from people smarter than me would be appreciated.
Some techniques:
If the awk script can only handle one file at a time, use a for loop (as shown in the other answer) or
find . -name '*.log' -exec awk -f HLGcheck.sh '{}' \; >> outputs.txt
If the awk script can handle multiple files:
awk -f HLGcheck.sh *.log >> outputs.txt
find . -name '*.log' -exec awk -f HLGcheck.sh '{}' \+ >> outputs.txt
bash has a for loop for this purpose:
$ for f in *.log; do your_processing_here; done
You can refer to the file being processed as $f.
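Putting that together with the command from the question, a minimal sketch (reusing the script and output file names from the question) would be:
for f in *.log; do
    awk -f HLGcheck.sh "$f" >> outputs.txt
done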

Copy SRC directory, adding a prefix to all C library functions

I have an embedded C static library that communicates with a hardware peripheral. It currently does not support multiple hardware peripherals, but I need to interface with a second one. I do not care about code footprint right now, so I want to duplicate that library: one copy for each piece of hardware.
This will, of course, result in symbol collisions. A good method is to use objcopy to add a prefix to the object files, so I can get hw1_fun1.o and hw2_fun1.o. This post illustrates it.
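For reference, the object-level approach mentioned above would look roughly like this (a sketch; the file names and prefixes are illustrative, and objcopy's --prefix-symbols option does the renaming):
objcopy --prefix-symbols=hw1_ fun1.o hw1_fun1.o
objcopy --prefix-symbols=hw2_ fun1.o hw2_fun1.o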
However, I want to add the prefix to all C functions at the source level, not the object level, because I will need to modify the code a little for hw2.
Is there any script, C preprocessor, or tool that can do something like:
./scriptme --prefix=hw2 ~/src/ ~/dest/
I'll be grateful :)
I wrote a simple shell script that does the required job, or something close to it. I hope it helps someone one day.
#!/bin/sh
DIR_BIN=bin/ext/lwIP/
DIR_SRC=src/ext/lwIP/
DIR_DST=src/hw2_lwIP/
CMD_NM=mb-nm
[ -d $DIR_DST ] || { echo "Destination directory does not exist!"; exit 1; }
cp -r $DIR_SRC/* $DIR_DST/
chmod -R 755 $DIR_DST   # cygwin/Windows 7 file-permission quirk
sync
# Collect the exported symbol names (read-only data and text) from the object files.
funs=`find $DIR_BIN -name '*.o' | xargs $CMD_NM | grep " R \| T " | awk '{print $3}'`
echo "Found $(echo $funs | wc -w) functions, processing:"
for fun in $funs
do
    echo "  $fun"
    find $DIR_DST -type f -exec sed -i "s/$fun/hw2_$fun/g" {} \;
done
echo "Done! Now change includes and compile your project ;-)"

Move files containing X but not containing Y

To manage my backup sync folder, I am trying to come up with a command that would move files whose names begin with string1 but do NOT end with string2 from /folder1 to /folder2.
What would a command combining two such opposite conditions (HAS and HAS NOT) look like?
#!/bin/bash
for i in `ls -d /folder1/string1* | grep -v 'string2$'`
do
    ls -ld $i | grep '^-' > /dev/null   # Test that we have a regular file and not a directory etc.
    if [ $? == 0 ]; then
        mv $i /folder2
    fi
done
Try something like
find /folder1 -mindepth 1 -maxdepth 1 -type f \
     -name 'string1*' \! -name '*string2' -exec cp -iv -t /folder2 {} +
Note: with the {} + form, {} has to come last, hence cp's -t option. If you have an older version of find (or a cp without -t), replace -t /folder2 {} + with {} /folder2 \;
To me this is another case for (what I shall call) the read-while pattern.
cd /folder1
ls string1* | grep -v 'string2$' | while read f; do mv "$f" /folder2; done
The other answers are good alternatives, and in particular find can do a lot. But I always get a headache using find, and never quite use it often enough to do so without the man page open.
Also, starting with ls or a simple find to get a list of files, then using any or all of sed, awk, grep or whatever you have to hand to adjust/trim/extend that list, and then feeding it into a loop, is a crude(ish) but pretty powerful technique.
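For file names that contain spaces, the same list-filter-loop pattern holds up a bit better with read -r and quoting (a sketch; it still isn't bulletproof against newlines in names, but is fine for typical backup folders):
cd /folder1
ls string1* | grep -v 'string2$' | while IFS= read -r f; do
    [ -f "$f" ] && mv "$f" /folder2
done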

Append some text to the end of multiple files in Linux

How can I append the following code to the end of numerous PHP files in a directory and its subdirectories:
</div>
<div id="preloader" style="display:none;position: absolute;top: 90px;margin-left: 265px;">
<img src="ajax-loader.gif"/>
</div>
I have tried with:
echo "my text" >> *.php
But the terminal displays the error:
bash : *.php: ambiguous redirect
I usually use tee because I think it looks a little cleaner and it generally fits on one line.
echo "my text" | tee -a *.php
You don't specify the shell, so you could try the foreach command. Under tcsh (and I'm sure a very similar version is available for bash) you can say something like this interactively:
foreach i (*.php)
foreach> echo "my text" >> $i
foreach> end
$i will take on the name of each file each time through the loop.
As always, when doing operations on a large number of files, it's probably a good idea to test them in a small directory with sample files to make sure it works as expected.
Oops .. I see bash in the error message (I'll tag your question with it). The equivalent loop would be
for i in *.php
do
    echo "my text" >> "$i"
done
If you want to cover directories one level below the one where you are, you can specify
*/*.php
rather than *.php
BashFAQ/056 does a decent job of explaining why what you tried doesn't work. Have a look.
Since you're using bash (according to your error), the for command is your friend.
for filename in *.php; do
    echo "text" >> "$filename"
done
If you'd like to pull "text" from a file, you could instead do this:
for filename in *.php; do
    cat /path/to/sourcefile >> "$filename"
done
Now ... you might have files in subdirectories. If so, you could use the find command to find and process them:
find . -name "*.php" -type f -exec sh -c "cat /path/to/sourcefile >> {}" \;
The find command identifies what files using conditions like -name and -type, then the -exec command runs basically the same thing I showed you in the previous "for" loop. The final \; indicates to find that this is the end of arguments to the -exec option.
You can man find for lots more details about this.
The find command is portable and is generally recommended for this kind of activity especially if you want your solution to be portable to other systems. But since you're currently using bash, you may also be able to handle subdirectories using bash's globstar option:
shopt -s globstar
for filename in **/*.php; do
    cat /path/to/sourcefile >> "$filename"
done
You can man bash and search for "globstar" for more details about this. This option requires bash version 4 or higher.
NOTE: You may have other problems with what you're doing. PHP scripts don't need to end with a closing ?> tag, so in files that don't, the HTML you append will land inside PHP code and the interpreter will try to parse it as PHP.
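If you want to check up front which files actually end with a closing ?> before appending, something like this should give a rough idea (a sketch; inspecting the last few bytes is only a heuristic, not a PHP parser):
shopt -s globstar
for f in **/*.php; do
    tail -c 20 "$f" | grep -q '?>' || echo "$f does not end with ?>"
done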
You can use sed combined with find. Assume your project tree is
/MyProject/
/MyProject/Page1/file.php
/MyProject/Page2/file.php
etc.
Save the code you want to append in /MyProject/ and call it append.txt.
From /MyProject/ run:
find . -name "*.php" -print | xargs sed -i '$r append.txt'
Explanation:
find does just that: it looks for all .php files, including in subdirectories
xargs runs sed on every .php file that has just been found
sed does the appending. '$r append.txt' means go to the last line of the file ($) and read in (paste) whatever is in append.txt there. Don't forget -i, otherwise sed will just print the appended result to stdout and not save it.
Source: http://www.grymoire.com/unix/Sed.html#uh-37
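As a quick sanity check before touching the whole tree, the same edit can be tried on a single file first, run from /MyProject/ (the path is just an example from the tree above):
sed -i '$r append.txt' Page1/file.php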
You can do this (it works even if there are spaces in your file paths):
#!/bin/bash
# Create a temporary file named /tmp/end_of_my_php.txt
cat << EOF > /tmp/end_of_my_php.txt
</div>
<div id="preloader" style="display:none;position: absolute;top: 90px;margin-left: 265px;">
<img src="ajax-loader.gif"/>
</div>
EOF
find . -type f -name "*.php" | while read the_file
do
    echo "Processing $the_file"
    #cp "$the_file" "${the_file}.bak"   # Uncomment if you want to keep a backup of each file
    cat /tmp/end_of_my_php.txt >> "$the_file"
done
echo
echo done
P.S.: You must run the script from the directory you want to process.
Inspired by @Dantastic's answer:
echo "my text" | tee -a file1.txt | tee -a file2.txt
