How to create ctags only for .c and .h files in code - c

I have a compiled code base. If I try generating the tags using '/usr/bin/ctags -R *', it includes all the .c, .h, object files, etc., so it takes a lot of time and also memory. Could you please let me know how to generate tags only for .c/.h files?

Use the --languages parameter:
ctags --languages=C
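For example, combined with recursion this restricts parsing to C sources, which covers both .c and .h files; a minimal sketch, run from the project root:
ctags -R --languages=C .
If your ctags build supports it, ctags --list-maps=C shows which file extensions are treated as C.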

You can use a .ctagsignore file to exclude any unwanted files from the generated tags.
The following is an example I use for my projects:
Contents of .ctagsignore:
bin
makefile
obj
tags
workspace.vim
And then generate tags with:
ctags -R --exclude=@.ctagsignore
This ignores every file in the bin and obj folders, the makefile, the (future) generated tags file, and the workspace.vim file in the project directory. It essentially works like a .gitignore file, so you can use wildcard (*.o) notation: myfolder/*.o ignores all object files in $PROJ_DIR$/myfolder/.

I would try 'man ctags' and read through the options.
Or you can use shell magic like:
ctags `find . -name "*.[ch]" -print`
or something like that.
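If the tree is large, a rough variant of the same idea is to stream the file list through xargs so the command line cannot overflow, appending to a single tags file:
find . -name "*.[ch]" -print0 | xargs -0 ctags -a
(-print0/-0 keeps file names with spaces intact; delete any stale tags file first, since -a only appends.)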

Related

IBM i PASE tar - Excluding files or directories

I want to exclude some directories from an archive using the PASE tar command on an IBM i, but the [-X Exclude File] option doesn't seem to work for me.
I tried using an exclude file that just contained a file name (/home/JSMITH/data/sub2/file2.txt) and then one that just contained a pattern (*.txt), and neither archive operation omitted anything.
Given the following directory structure:
/home/JSMITH/data
/home/JSMITH/data/sub1
/home/JSMITH/data/sub1/file1.txt
/home/JSMITH/data/sub2
/home/JSMITH/data/sub2/file2.txt
/home/JSMITH/data/sub3
/home/JSMITH/data/sub3/file3.txt
and the following command:
/qopensys/usr/bin/tar -cvf /home/JSMITH/test.tar -X /home/JSMITH/excludes.txt /home/JSMITH/data
The entire /home/JSMITH/data structure gets included in the resulting archive.
I have tried using the /home/JSMITH/excludes.txt file with either of these contents:
/home/JSMITH/data/sub2/file2.txt
or
*.txt
How does one exclude files/directories/patterns with the IBM i PASE tar command?
You need the full path in the exclude file.
I created mine via ls /home/JSMITH/data/*.txt > /home/JSMITH/excludes.txt
If you're doing it by hand, make certain you haven't got any trailing whitespace.
Also, I used Notepad++ when I created mine by hand. I found that the green-screen EDTF editor created an EBCDIC file with CRLF line endings, and that didn't exclude for me.
IBM i 7.1
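For illustration, a minimal sketch assuming the directory layout from the question (the .txt files sit one level below /home/JSMITH/data, so the glob needs the extra */):
ls /home/JSMITH/data/*/*.txt > /home/JSMITH/excludes.txt
/qopensys/usr/bin/tar -cvf /home/JSMITH/test.tar -X /home/JSMITH/excludes.txt /home/JSMITH/data
The exclude file then contains full paths such as /home/JSMITH/data/sub2/file2.txt, one per line, with no trailing whitespace and LF line endings.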

exuberant ctags with C not yielding redirection to function definition properly

I was using ctags with C on an old project, mostly coding in Vim, and ctags was working fine (jumping to definitions perfectly).
Now the code base has changed a lot, so I ran the command:
ctags -R
Now when I press Ctrl + ], I am not taken to the function/macro definition as it used to do; at times it takes me to the correct line, but mostly it takes me to some lines above/below the actual definition. Can anybody please help?
It sounds like you have a tag file for each directory in your project. ctags -R only regenerates the tags file in the current directory; it does not update all the tags files recursively. This is why, when you press <C-]>, you jump to a place only roughly near the actual definition.
To fix this you should have only one tags file, and it should be located at the root of your project. Then in your vimrc add set tags=./tags;/. This lets Vim search for a tags file starting from your current directory and walking up the tree until it finds one (stopping at the root directory).
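A minimal sketch (the project path is only an example). In the shell:
cd /path/to/project
ctags -R .
and in your ~/.vimrc:
set tags=./tags;/
With that, no matter which file you open under /path/to/project, Vim walks up to the root and picks up the single tags file there.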

How to achieve symbol referencing across directories in vim?

Can ctags tag symbols from a directory up in the hierarchy too, or is it limited to creating tags for the current directory and sub-directories only?
Basically I'm looking for Visual Studio like symbol cross referencing it is very helpful in understanding alien source code flow.
If not Vim, then which other editor should I use?
thanks
Ctags only recurses to subdirectories. But all you have to do is run ctags -R . in your project home directory, and it will create a tags file for your whole project.
You aren't limited to specifying one tags file in Vim. This is an alternative to the other answers; you can just do something like:
set tags=tags,~/wintags,c:/path/to/moretags/etc
So you don't need to take the time regenerating a monolithic tags file when you just want to update your local tags.
Regarding the OP's comment in another answer,
yes thats correct but when i open a file say proj/dir1/def.c and press ctrl+] on a function name which is defined say in proj/dir2/abc.c, I get tag not found :(
You could also create one tags file for all of your projects at the 'proj' root:
set tags=tags;c:/path/to/proj
This will use the first file named tags that it finds as it walks up the directory hierarchy from where you are.
You can combine these two techniques to have a project-local tags file and then a "global" tags file that isn't updated as often.
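For example (the paths here are only placeholders), a single vimrc line can name both:
set tags=./tags,tags,~/tags/global-tags
Vim tries the project-local file first and falls back to the rarely regenerated global one.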
Whilst it has a similar user interface for asking it to do its thing (you need to explicitly tell it to go down directories), I find that cscope is a very nice tool which does everything that ctags does and a bit more.
ctags (well, exctags at least) can create tags for as many directory trees as you want. Simply run
exctags -R dir1 dir2 ...
Then vim knows about all the symbols you need. For example, one of the directories could be /usr/include in addition to your own source directory.
Make sure to run vim path/to/file.c from the same directory you created the tags file in.
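A rough sketch with made-up paths, pulling in the system headers as a second tree:
cd ~/work
exctags -R ~/src/myproject /usr/include
vim ~/src/myproject/main.c
The tags file is written into ~/work, and because vim is started from that same directory it finds the file automatically.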

Creating makefile for C code, running without ./

I need to create a makefile that will compile my simpleprogram.c to sp so that it can be called like Unix commands such as ls, ps, etc., without explicitly writing ./sp. I looked on the web and cannot find a solution, or I am searching for it the wrong way. I cannot search for something like "executable without ./", because I do not know what "./" is called.
Put the binary in a directory that's in your PATH.
http://www.cs.purdue.edu/homes/cs348/unix_path.html
Just copy your program to your system's bin (executable binaries) directory.
Most commonly that is /usr/bin for programs which can be used by all users.
If the app is only for admins, you should use the /usr/sbin/ directory.
Remember to set the "executable" flag with chmod: chmod +x your_app
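A sketch, using the /usr/bin location mentioned above (sp is the binary from the question):
sudo cp sp /usr/bin/
sudo chmod +x /usr/bin/sp
sp
After that, typing sp works from any directory because /usr/bin is on the PATH.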
The proper solution to this (assuming you don't want sp to be run from outside of your makefile) is to call your program using the full path name instead of ./ (which is relative, and can change during multi-directory makes). In your makefile do something like:
SP_DIR := $(shell pwd)/spdir
rule : somedependency
	$(SP_DIR)/sp
Where $(shell pwd) will expand to the directory the makefile is being run from. If your sp directory is in a parent directory of this, it is possible to use .. in the path as well: eg.
SP_DIR := $(shell pwd)/../../spdir
If you do want to run sp from outside of the makefile, then you need to either copy sp to a directory specified in your PATH variable (do echo $PATH to see these), or modify your .bashrc or equivalent file to make PATH include the directory that sp is built in.
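As an illustration, a hypothetical makefile target that builds sp and copies it into a directory on PATH (PREFIX here is an assumption; point it at whatever echo $PATH actually lists):
PREFIX ?= $(HOME)/bin

sp: simpleprogram.c
	$(CC) -o sp simpleprogram.c

install: sp
	cp sp $(PREFIX)/
After make install, and provided $(HOME)/bin really is in PATH, sp can be run from anywhere without ./.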
John
You can just do:
export PATH=$PATH:.
But this is not a good idea, in general.

Cat selected files fast and easy?

I have been cat'ing files in the Terminal until now, but that is time-consuming when done a lot. What I want is something like:
I have a folder with hundreds of files, and I want to effectively cat a few files together.
For example, is there a way to select (in the Finder) a few split files;
file.txt.001, file.txt.002, file.txt.003, file.txt.004
.. and then right click on them in the Finder, and just click Merge?
I know that isn't possible out of the box of course, but with an Automator action, droplet or shell script, is something like that possible to do? Or maybe by assigning that cat action a keyboard shortcut, so that when it is hit, the selected files in the Finder are automatically merged together into a new file AND placed in the same folder, WITH a name based on the original split files?
In this example, file.001 through file.004 would magically appear in the same folder as a file named fileMerged.txt?
I have like a million of these kinds of split files, so an efficient workflow for this would be a life saver. I'm working on an interactive book, and the publisher gave me this task.
cat * > output.file
works as an sh script. It redirects the concatenated contents of the files into that output.file.
* expands to all files in the directory.
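If you want to drive that from the Finder, a minimal sketch of an Automator "Run Shell Script" action (shell /bin/sh, input passed as arguments; the output name merged.txt is just an assumption):
dir=$(dirname "$1")
cat "$@" > "$dir/merged.txt"
Wrap it in a Quick Action/Service that receives files in Finder, and the merged file lands in the same folder as the selected pieces.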
Judging from your description of the file names you can automate that very easily with bash. e.g.
PREFIXES=`ls -1 | grep -o "^.*\." | uniq`
for PREFIX in $PREFIXES; do cat ${PREFIX}* > ${PREFIX}.all; done
This will merge all files in one directory that share the same prefix.
ls -1 lists all files in a directory (if the files span multiple directories you can use find instead). grep -o "^.*\." will match everything up to the last dot in the file name (you could also use sed -e 's/\.[0-9]*$/./' to remove the trailing digits). uniq filters out duplicates. You then have something like speech1.txt. sound1.txt. in the PREFIXES variable. The next line loops through those and merges each group of files using the * wildcard.
