Looping through all the inputs and creating distinct output files - file

I am using Cygwin on Windows 7. I have a folder with about 1200 files of the same type (there are no sub-directories). I am trying to go through the folder and perform a certain action (a bioinformatic alignment) on each file. Here is the code that I am using:
$ for file in Project/genomes_0208/*;
do ./bowtie-build $file ../bowtie-0.12.7/indexes/abc;
done
./bowtie-build is the operation that I want to perform. Now, this does complete the operation for all the files in the folder, but it keeps writing the output to the same file, abc in this case.
So in the end I have only one file, containing the latest output. How can I create 1200 different files, one for each input? It doesn't matter what I name the output files; it could be anything, as long as they are obviously different.
Hope I explained the problem successfully, I'd appreciate any help with this!

How about:
./bowtie-build "$file" "${file}.out"
If your files had unique names to begin with, this should also produce unique output files.
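Putting that together with the original loop gives something like the sketch below; basing each index name on the input file's name and keeping the existing indexes directory are just illustrative assumptions, not anything bowtie-build requires:
for file in Project/genomes_0208/*; do
    name=$(basename "$file")
    # one distinct index basename per input file (assumed naming scheme)
    ./bowtie-build "$file" "../bowtie-0.12.7/indexes/${name}"
done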

Related

Rename a lot of files in subdirectories with multiple file extensions

So I have a very big folder full of more folders which hold files that all have their regular extension, but then with ,v after it (like .xml,v).
Is there a quick way/program to make it go through all of the folders and, when it finds a ,v, remove the ,v from it?
Thanks
EDIT: I am running Windows 7 (64-bit). Also please remember that I am an idiot :P
Use find to list the files ending in ,v. Pipe the output to a shell loop that renames the files.
${f%%,v} expands to the file name without the ,v suffix.
find . -name '*,v' | while read f; do mv "$f" "${f%%,v}"; done
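If any of the paths contain spaces, a slightly more defensive variant of the same idea (assuming bash and a find that supports -print0, which GNU and BSD find both do):
find . -name '*,v' -print0 | while IFS= read -r -d '' f; do mv "$f" "${f%,v}"; done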
Not clear. Where do you have the files (on your computer or on a server)?
What is the platform (Windows / Linux)?
There are multiple ways to solve it depending on the scenario (for example, a tiny batch file can do it in a flash if the folder is on your local computer running Windows).

Copying multiple files to a Winscp directory via Script

I have a problem when trying to upload multiple files to one WinSCP directory. I can manage to copy just one single file, but I need to upload many files that are generated by a piece of software and whose names are not fixed, so I need to use wildcards in order to copy all of them. I have tried many variants of the code, but they were all unsuccessful. The code I am using is:
open "sftp://myserver:MyPass#sfts.us.myserver.com" -hostkey="hostkey"
put "C:\from*.*" "/Myserverfolder/Subfolder/"
exit
This code does actually copy the first alphabetically named file, but it ignores the rest of the files.
Any help with it would be much appreciated
Try this in the script:
lcd C:\from
cd /Myserverfolder/Subfolder
put *
Try to do it all manually first so you can see what's going on.
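Put together with the session from the question, the whole script would look something like this; the open line and the paths are taken from the question, and whether the remote directory needs the leading slash depends on your server, so treat it as a sketch rather than a drop-in script:
open "sftp://myserver:MyPass#sfts.us.myserver.com" -hostkey="hostkey"
lcd "C:\from"
cd "/Myserverfolder/Subfolder"
put *
exit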

Perl Script not saving C outfile

I've been adding a plugin to an existing project, and the whole thing is tied together with a Perl script. I'm trying to add my C program into the Perl script to produce an output file, but the output is garbage or missing.
My executable is called Interpolate, and when it's in the same folder as the Perl script it works just fine:
./Interpolate inv.tracking_log
is how the command is run. It should produce an intermediate file called tmp.log and a final file called out.txt. When I run it in the directory it does just fine; both files are as they should be.
So then I added a system call to the Perl script (I barely (if that) know Perl):
print("./Interpolate $inVideoFile");   # prints the command (just a test)
my $interCall = system("./Interpolate $inVideoFile");
When running it from within the Perl script, the tmp.log file is mostly garbage, and out.txt is missing entirely. I realize out.txt is most likely missing because it depends on the tmp.log file. Is there a Perl 'gotcha' that I'm missing somewhere?
system("./Interpolate $inVideoFile");
should be
system("./Interpolate", $inVideoFile);
If you still have a problem after fixing that, $inVideoFile doesn't contain what it should, or there's a bug in your C program. (What is the return value of the system call?)

Cat selected files fast and easy?

I have been cat'ing files in the Terminal until now, but that is time consuming when done a lot. What I want is something like this:
I have a folder with hundreds of files, and I want to effectively cat a few files together.
For example, is there a way to select (in the Finder) four split files;
file.txt.001, file.txt.002, file.txt.003, file.txt.004
.. and then right click on them in the Finder, and just click Merge?
I know that isn't possible out of the box of course, but with an Automator action, droplet or shell script, is something like that possible to do? Or maybe assigning that cat action a keyboard shortcut, so that when it is pressed, the files selected in the Finder are automatically merged together into a new file AND placed in the same folder, WITH a name based on the original split files?
In this example file.001 through file.004 would magically appear in the same folder, as a file named fileMerged.txt ?
I have like a million of these kinds of split files, so an efficient workflow for this would be a life saver. I'm working on an interactive book, and the publisher gave me this task.
cat * > output.file
works as a sh script. It redirects the contents of the files into output.file.
* expands to all files in the directory.
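For the file names in the question, a concrete version (the output name fileMerged.txt is just the example from the question) would be:
cat file.txt.001 file.txt.002 file.txt.003 file.txt.004 > fileMerged.txt
or, relying on the shell expanding the glob in sorted order:
cat file.txt.* > fileMerged.txt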
Judging from your description of the file names you can automate that very easily with bash. e.g.
PREFIXES=`ls -1 | grep -o "^.*\." | uniq`
for PREFIX in $PREFIXES; do cat ${PREFIX}* > ${PREFIX}.all; done
This will merge all files in one directory that share the same prefix.
ls -1 lists all the files in the directory (if your files span multiple directories you can use find instead). grep -o "^.*\." matches everything up to the last dot in the file name (you could also use sed -e 's/\.[0-9]*$/./' to strip the trailing digits). uniq filters out duplicates. After that you have something like speech1.txt. and sound1.txt. in the PREFIXES variable. The last line loops through those prefixes and merges each group of files using the * wildcard.
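Neither answer covers the right-click-in-Finder part of the question, but here is a rough sketch of how that could work: create an Automator Service / Quick Action containing a "Run Shell Script" action with "Pass input" set to "as arguments", so the files selected in the Finder arrive as "$@". The output naming below (everything before the first dot plus Merged.txt) is only an assumption based on the example in the question:
dir=$(dirname "$1")
base=$(basename "$1")
out="$dir/${base%%.*}Merged.txt"     # e.g. file.txt.001 -> fileMerged.txt (assumed naming)
# sort the selected paths so the pieces are concatenated in numeric-suffix order
printf '%s\n' "$@" | sort | while IFS= read -r f; do cat "$f"; done > "$out"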

Combining files in Notepad++

Is there a Notepad++ plugin out there that automatically combines all currently opened files into a single file?
Update: Yes, I am very aware of copy and paste :) I'm working with lots of files, and I want a solution that makes this step in the process a little bit faster than several dozen copy and pastes.
I'm aware of utilities for combining files, but I want the convenience of combining specifically the files that are currently opened in my text editor.
If there isn't a plugin out there already, I'll write one myself; I was just wondering if it exists already to save me the time of developing one.
I used the following command at a DOS prompt to do the merge for me:
for %f in (*.txt) do type "%f" >> output.txt
It is fast and it works. Just ensure that all the files to be merged are in the same directory from which you execute this command.
http://www.scout-soft.com/combine/
Not my app, but this plugin lets you combine all open tabs into one file.
I installed the Python Script plugin and wrote a simple script:
console.show()
console.clear()
files = notepad.getFiles()              # the tabs that were open before the new file is created
notepad.new()
newfile = notepad.getCurrentFilename()  # empty tab that the combined text is collected into
for i in range(len(files) - 1):
    console.write("Copying text from %s\n" % files[i][0])
    notepad.activateFile(files[i][0])   # switch to the source tab and grab its text
    text = editor.getText()
    notepad.activateFile(newfile)       # switch back and append it to the combined file
    editor.appendText(text)
    editor.appendText("\n")
console.write("Combine Files operation complete.")
It looks at all the files currently opened in Notepad++ and adds them to a new file. Does exactly what I need.
