I've got a known, predetermined set of calls to a function
FUNC_A("ABCD");
FUNC_A("EFGH");
And what I was hoping to do was something like
#define FUNC_A("ABCD") 0
#define FUNC_A("EFGH") 1
#define FUNC_A(X) 0xFF
So that the whole thing gets replaced by the integer before compiling, and I can switch on the value and not have to store the string or do the comparison at run-time.
I realize that we can't do this in the preprocessor, but I was just wondering if anyone has come across some nifty way of getting around this seemingly solvable problem.
You may handcraft your comparison if you need that, but this will be tedious. For simplicity let us suppose that we want to do it for the string "AB":
#define testAB(X) ((X) && (X)[0] == 'A' && (X)[1] == 'B' && !(X)[2])
This will return 1 when the string is equal to "AB" and 0 otherwise, and it also takes care that the string is of the correct length and is not accessed beyond its bounds, etc.
The only thing you'd have to worry about is that the argument X is evaluated multiple times. This isn't a problem if you pass in a string literal, but it would be for expressions with side effects.
For string literals any decent compiler should be able to replace such an expression at compile time.
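As a quick illustration (just a sketch of my own, not part of the question's code), using the macro with literals lets an optimizing compiler fold each call to a plain 0 or 1:
#include <stdio.h>

#define testAB(X) ((X) && (X)[0] == 'A' && (X)[1] == 'B' && !(X)[2])

int main(void) {
    /* With string literals the whole expression is computable at compile
       time, so an optimizing compiler reduces it to a constant. */
    printf("%d\n", testAB("AB"));   /* 1 */
    printf("%d\n", testAB("ABCD")); /* 0 */
    return 0;
}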
For doing as you describe, avoiding strings and run-time comparisons, I can only think of a pre-preprocessor. If it were just for quick hacking around, in a Unix environment I'd try a simple wrapper for the preprocessor: a bash script that in turn uses sed or awk to replace the mentioned functions and arguments and then calls the real cpp preprocessor. I'd consider this just a quick hack.
Update: On Linux with gcc, it seems easier to do a post-preprocessor, because we can replace the generated .i file (while we generally can't do that with the original .c file). To do that, we can make a cc1 wrapper.
Warning: this is another dangerous and ugly hack. Also see Custom gcc preprocessor
This is a cc1 wrapper for doing that. It's a bash script for Linux and gcc 4.6:
#!/bin/bash
# cc1 that does post preprocessing on generated .i files, replacing function calls
#
# note: post-preprocessing is easier than pre-preprocessing, because in
# post-preprocessing we can replace the temporary .i file generated by the
# preprocessor. With pre-preprocessing we would have to change the original
# .c file (which is unacceptable), or generate a new temporary .c file with
# our preprocessing before calling the real preprocessor, but then any error
# messages would refer to that temporary .c file.
convert ()
{
local i=$1
local o=$2
ascript=$(cat <<- 'EOAWK'
{
FUNCT=$1;
ARGS=$2;
RESULT=$3;
printf "s/%s[ \\t]*([ \\t]*%s[ \\t]*)/%s/g\n", FUNCT, ARGS, RESULT;
}
EOAWK
)
seds=$(awk -F '|' -- "$ascript" << EOFUNCS
FUNC_A|"ABCD"|0
FUNC_A|"EFGH"|1
FUNC_A|X|0xFF
EOFUNCS
)
sedfile=$(mktemp --tmpdir prepro.sed.XXX)
echo -n "$seds" > "$sedfile"
sed -f "$sedfile" "$i" > "$o"
rc=$?
rm "$sedfile"
return $rc
}
for a
do
if [[ $a = -E ]]
then
isprepro=1
elif [[ $isprepro && $a = -o ]]
then
getfile=1
elif [[ $isprepro && $getfile && $a =~ ^[^-].*[.]i ]]
then
ifile=$a
break
fi
done
#echo "args:$#"
#echo "getfile=$getfile"
#echo "ifile=$ifile"
realcc1=/usr/lib/gcc/i686-linux-gnu/4.6/cc1
$realcc1 "$@"
rc=$?
if [[ $rc -eq 0 && $isprepro && $ifile ]]
then
newifile=$(mktemp --tmpdir prepro.XXX.i)
convert "$ifile" "$newifile" && mv "$newifile" "$ifile"
fi
exit $rc
How to use it: call gcc with the flags -B (pointing at the directory where the cc1 wrapper resides) and --no-integrated-cpp.
Related
I have a file foo.h
The file has some macros, and some of them appear twice. I want to add a suffix '_repeat' to the second occurrence.
There are many such repetitions in the file. I want to write a generic script to add the suffix to the second occurrence.
For eg.
/* Info: VARIABLE */
#define VARIABLE 02
I want it to look like,
/* Info: VARIABLE */
#define VARIABLE_repeat 02
This might work for you (GNU sed):
sed -r '/VARIABLE/{:a;N;/(VARIABLE).*\1/!ba;s//&_repeat/}' file
Gather up lines on encountering the required string until a second occurrence is matched, then append a suffix.
Absolutely no guarantees of robustness, but you could start with:
awk '$1 == "#define" && a[$2]++{ $2=$2 "_repeat" a[$2]} 1' foo.h
This will handle multiple repetitions, appending "_repeat2", "_repeat3", etc. to duplicates.
awk '($1=="#define"){a[$2]++}(a[$2]==2){$2=$2 "_repeat"}1' data.txt
Here's a perl version that - to me at least - seems much easier to understand:
perl -p -e 's/(#define\s+)(\w+)/"$1$2".(++$w{$2} > 1 ? "_repeated" : "")/e' < foo.h
or here's another version:
perl -p -e 's/(#define\s+)(\w+)/$x = ++$w{$2} > 1 ? "_repeated" : ""; "$1$2$x"/e' < foo.h
I'm using getopt to parse the parameters from the command line, and I have problems to make it recognize the optional parameter order.
I want to achieve these cases:
$ ./program -i file.html -o output.html #ex 1
$ ./program -i -o file.html output.html #ex 2
$ ./program -o output.html -i file.html #ex 3
My code looks like this:
while((c = getopt(argc, argv, "hi:o:")) != -1) {
switch(c) {
case 'h':
//prints the help file
break;
case 'i':
ivalue = optarg;
break;
case 'f':
fvalue = optarg;
break;
case '?':
//prints an error
break;
default:
abort();
}
}
To debug this better, I've also written this outside the while loop:
for(int i = optind; i < argc; i++) {
printf("non optional argument %s\n", argv[i]);
return 0;
}
Examples 1 and 3 work correctly, while example 2 does not get the parameters straight. At first I thought it was simply not possible with this function, but then in this example I saw it was.
There's also a bonus question: how come calling the program with no parameters doesn't abort()?
I'm using Ubuntu 15.10 and gcc 5.2.1 installed with apt-get (dunno if it's useful, but better safe than sorry).
in this example I saw it was [possible].
No, you didn't. Your option -i requires an argument, but in your non-working case you try to insert a different option between the -i and its argument. That is not allowed -- if an option takes an argument, whether optional or required, that argument (if provided) must immediately follow the option letter. The example you linked does not show differently.
What you are trying to do not only is not supported by getopt(), it does not follow standard Unix convention. Moreover, although it could conceivably work the way you imagine for options with required arguments, it is wholly unworkable for options with optional arguments. When arguments are optional, there is no way to correctly match them with their options if they are not required to follow directly.
There's also a bonus question: how come calling the program with no parameters doesn't abort()?
Why would it? If no options are specified to the program on its command line then getopt() returns -1 on the first call, and the loop body is never executed. To get an abort() from your code, you would need to pass an option that is in the option string but for which you do not provide a specific case; as the code is presently written, that is -o, since you handle 'f' but not 'o'. An option with a missing required argument would not do it either: by default getopt() returns '?' for that (it returns ':' only when the option string starts with ':'), and you do provide a '?' case.
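For reference, a minimal sketch (the variable names are mine) of how the two situations are usually told apart: a leading ':' in the option string makes getopt() return ':' for a missing argument instead of '?':
#include <stdio.h>
#include <unistd.h>

int main(int argc, char *argv[]) {
    int c;
    /* Leading ':' in the option string: getopt() returns ':' for a missing
       argument and '?' for an unknown option, so the two can be told apart. */
    while ((c = getopt(argc, argv, ":hi:o:")) != -1) {
        switch (c) {
        case 'i': printf("input:  %s\n", optarg); break;
        case 'o': printf("output: %s\n", optarg); break;
        case 'h': puts("help"); break;
        case ':': fprintf(stderr, "option -%c needs an argument\n", optopt); break;
        case '?': fprintf(stderr, "unknown option -%c\n", optopt); break;
        }
    }
    return 0;
}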
What you're trying to do isn't possible with getopt. The "i:" in your optstring means that if the -i option is present, it must have an argument. In your second example this means that -o is interpreted as the argument to -i and not as an option.
If you're on Linux and linking with glibc you can use the :: GNU extension (e.g. "hi::o:") to make -i take an optional argument. However, this would break your first and third examples, as the optional argument is required to appear immediately after the option (e.g. -ifile.html).
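A rough sketch of that variant (assuming glibc; the variable names are mine); note that optarg is NULL when -i is given with no attached argument:
#include <stdio.h>
#include <unistd.h>

int main(int argc, char *argv[]) {
    const char *ivalue = NULL, *ovalue = NULL;
    int c;

    /* "i::" (GNU extension): -i takes an optional argument, which must be
       attached directly to the option letter, e.g. -ifile.html */
    while ((c = getopt(argc, argv, "hi::o:")) != -1) {
        switch (c) {
        case 'i': ivalue = optarg; break;  /* NULL if nothing was attached */
        case 'o': ovalue = optarg; break;
        default:  break;
        }
    }
    printf("i=%s o=%s\n", ivalue ? ivalue : "(none)",
           ovalue ? ovalue : "(none)");
    return 0;
}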
To get around this "limitation" (I'd sooner call it intended functionality), I used flags and functions. You move through each getopt argument and set flags and variables as you pass through (and in some cases call functions to get return values and unset other flags, if you want a particular one to take precedence over another). Then, once complete, you work out what happens in what order.
For example, for a script to make ACL setting easier for an end user (pseudocode areas are clearly marked):
getHelp () {
printf "\nThis does stuff with [OPTIONS] and stuff...\n\n"
exit 0
}
main () {
if [[ $targetFile == "-h" ]]; then
getHelp
exit 0
fi
[[ ! -f $targetFile ]] && [[ ! -d $targetFile ]] && {
printf "\nYou must specify a target File or Folder. Exiting.\n"
exit 1
}
modeOctal=1
if [[ readMode -eq 1 ]] || [[ writeMode -eq 1 ]]; then
printf "We're doing stuff to this file or folder...\n"
if [[ readMode -eq 1 ]]; then
modeOctal=`expr $modeOctal + 4`
elif [[ writeMode -eq 1 ]]; then
modeOctal=`expr $modeOctal + 2`
fi
*code to do stuff to setfacl on target*
fi
if [[ removeMode -eq 1 ]]; then
*code to do stuff to setfacl -x permissions on target*
fi
}
while true; do
case "$1" in
-h)
getHelp
exit 0
;;
-r)
readMode=1
;;
-w)
writeMode=1
;;
-R)
readMode=0
writeMode=0
removeMode=1
;;
-t)
shift
targetFile=$1
targetIsSet=1
;;
--)
shift
break
;;
esac
shift
done
if [[ $targetIsSet == 1 ]]; then
main
exit 0
fi
exit 0
Since you can count on things going in the same order every time, you can be sure that your execution will be consistent no matter what order the options come in. In this example, you also allow someone using the -t option (which requires a second argument) to use '-h' to get help there instead.
You can use similar logic to put scripting in the functions that handle your '-i' or '-o' so they behave differently if the next item is the other option (e.g. -i -o output.file or -o -i input.file).
gcc 4.7.2
c89
Hello,
I am wondering if anyone knows of any tutorials or textbooks that cover using a makefile to create some simple unit testing for my C programs.
I would like to run some automated testing that will create a test suite and add it to my Makefile.
Just want some ideas on how to get started.
Many thanks for any suggestions
Yes indeed, in less than 30 lines of makefile you can build a generic unit test engine.
Note that I wrote the following for testing gawk and lisp scripts but it can be easily customized for c. Actually, IMHO, the whole thing is a nice example of the power of shell scripting.
To begin, you place all your tests as executable files in some $Testdir. In this example, all the tests have file names 001, 002, etc. (with no extension).
Next, you need some set up stuff:
Here=$(PWD)
Testdir=$(Here)/eg
ready: $(Testdir) $(Tmp)
$(Tmp) :
@- [ ! -d $(Tmp) ] && mkdir $(Tmp)
Then you'll need to collect all the tests into a list called $Tests
Tests:=$(shell cd $(Testdir); ls | egrep '^[0-9]+$$' | sort -n )
(Note the use of :=. This is a slight optimization that builds $Tests once and uses it many times.)
Each file $(X) in my list of $Tests can be executed in two ways. Firstly, you can just run it. Secondly, you can run it and cache the results in $(X).want.
run : ready $(Testdir)/$(X)
@echo $X 2>&1
@cat $(Testdir)/$(X) | $(Run)
cache : ready
@$(MAKE) run | tee $(Testdir)/$X.want
@echo new test result cached to $(Testdir)/$X.want
I cache a test outcome once the test is ready to go and is producing the right output.
The actual execution is defined by a magic variable called $(Run). This is something you have to write specifically for the language being tested. For the record, I'm testing Gawk scripts so my $(Run) is just as follows (and you can change it to whatever you need).
Run= gawk -f mycode.awk
Once that is done, then to run one test, I just compare what I get after running $(X) to the cached copy:
test : ready $(Testdir)/$(X).want
@$(MAKE) run > $(Tmp)/$X.got
@if diff -s $(Tmp)/$X.got $(Testdir)/$X.want > /dev/null; \
then echo PASSED $X ; \
else echo FAILED $X, got $(Tmp)/$X.got; \
fi
This is how I run all my tests:
tests:; @$(foreach x, $(Tests), $(MAKE) X=$x test;)
You can also do a batch cache of all the current outputs (warning: do not do this unless your tests are currently generating the right output):
cache :
@$(foreach x, $(Tests), $(MAKE) X=$x cache;)
Finally, if you want, you can also get final score of the PASSEDs and FAILEDs:
score :
@$(MAKE) tests | cut -d\ -f 1 | egrep '(PASSED|FAILED)' | sort | uniq -c
That's it: as promised, a generic unit test tool in Make in under 30 lines. Share and enjoy.
Generating Test Function Snippets
Generating function snippets is easy with the help of gccxml and nm. The following bash script generate_snippets.sh can be called with one command line argument to generate a test function snippet for each function defined in a source file:
#!/bin/bash
#
# Generate stub functions for one file
#
# # Initialize
FILE=$1
[ ! -e "$FILE" ] && echo "file doesn't exist" && exit -1
# # Compile
OBJECT=$(mktemp)
gcc -c $FILE -o $OBJECT
# # Get Functions
# ## Get all symbols in the text sections
Y=$(mktemp)
nm $OBJECT | grep " T " | awk '{print $3;}' | sort > $Y
# ## Get functions defined in the file (including #includes)
# get all functions defined in the compilation unit of $FILE, excluding included
# dynamically linked functions
XML=$(mktemp)
X=$(mktemp)
gccxml $FILE -fxml=$XML
grep "<Function" $XML | sed 's/^.*name="\([^"]*\).*/\1/g' | sort > $X
# ## get the common lines
# This is done to get those functions which are defined in the source file and end up in
# the compiled object file.
COMMON=$(comm $Y $X -1 -2)
# # Create stubs
for func in $COMMON;
do
cat <<_
// Test stub for $func. Returns 1 if it fails.
char test_$func() {
return 1;
}
_
done
# # Clean up
rm $OBJECT $XML $X $Y
TODOs
The script is not yet perfect. You should probably include a test to only generate test functions for those functions which aren't tested yet. As this is done analogously to finding the common names between $X and $Y, I leave it as an exercise. When this is implemented, it makes sense to run this script from a Makefile. See the other answer for pointers on that.
Example Usage
Consider the C file hello.c:
#include <stdio.h>
int foo();
int bar();
int main() {
printf("hello, world\n");
return 0;
}
int foo() {
printf("nothing\n");
return 1;
}
int bar() {
printf("still nothing\n");
return 1;
}
Running the script above with this file as input yields the following output:
// Test stub for bar. Returns 1 if it fails.
char test_bar() {
return 1;
}
// Test stub for foo. Returns 1 if it fails.
char test_foo() {
return 1;
}
// Test stub for main. Returns 1 if it fails.
char test_main() {
return 1;
}
Just put those snippets into the appropriate file and fill them with logic as needed. After that, compile the test suite and run it.
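For instance, a minimal hand-written runner for the hello.c stubs above (the file name and the summary format are my own, not produced by the script) might look like this:
/* test_runner.c (hypothetical name): drives the generated test_* stubs */
#include <stdio.h>

/* declarations of the generated stubs */
char test_foo(void);
char test_bar(void);
char test_main(void);

int main(void) {
    int failed = 0;

    /* each stub returns 1 if it fails, 0 otherwise */
    failed += test_foo();
    failed += test_bar();
    failed += test_main();

    printf("%d test(s) failed\n", failed);
    return failed ? 1 : 0;
}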
I have a header that defines a large number of macros, some of whom depend on other macros -- however, the dependencies are all resolved within this header.
I need a one-liner for printing out the value of a macro defined in that header.
As an example:
#define MACRO_A 0x60000000
#define MACRO_B MACRO_A + 0x00010000
//...
As a first blush:
echo MACRO_B | ${CPREPROCESSOR} --include /path/to/header
... which nearly gives me what I want:
# A number of lines that are not important
# ...
0x60000000 + 0x00010000
... however, I'm trying to keep this from ballooning into a huge sequence of "pipe it to this, then pipe it to that ...".
I've also tried this:
echo 'main(){ printf( "0x%X", MACRO_B ); }' \
| ${CPREPROCESSOR} --include /path/to/header --include /usr/include/stdio.h
... but it (the gcc compiler) complains that -E is required when processing code on standard input, so I end up having to write out to a temporary file to compile/run this.
Is there a better way?
-Brian
echo 'void main(){ printf( "0x%X", MACRO_B ); }' \
| gcc -x c --include /path/to/header --include /usr/include/stdio.h - && ./a.out
will do it in one line.
(You misread the error GCC gives when reading from stdin: you need -E or -x, the latter to specify what language is expected.)
Also, it's int main(), or, when you don't care like here, just drop the return type entirely. And you don't need to specify the path for stdio.h.
So slightly shorter:
echo 'main(){printf("0x%X",MACRO_B);}' \
| gcc -xc --include /path/to/header --include stdio.h - && ./a.out
What about tail -n1? Like this:
$ echo C_IRUSR | cpp --include /usr/include/cpio.h | tail -n 1
000400
How about artificially generating an error that contains your MACRO_B value in it, and then compiling the code?
I think the easiest way would be to write a small C program, include the header in it, and print the desired output. Then you can use it in your script, makefile or whatever.
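A minimal sketch of such a helper (the file name print_macro.c is mine; the header path is the placeholder from the question):
/* print_macro.c: prints the fully evaluated value of MACRO_B */
#include <stdio.h>
#include "/path/to/header"

int main(void) {
    printf("0x%X\n", (unsigned int) (MACRO_B));
    return 0;
}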
echo '"EOF" EOF' | cpp --include /usr/include/stdio.h | grep EOF
prints:
"EOF" (-1)
I have code that has a lot of complicated #define error codes that are not easy to decode since they are nested through several levels.
Is there any elegant way I can get a list of #defines with their final numerical values (or whatever else they may be)?
As an example:
<header1.h>
#define CREATE_ERROR_CODE(class, sc, code) ((class << 16) & (sc << 8) & code)
#define EMI_MAX 16
<header2.h>
#define MI_1 EMI_MAX
<header3.h>
#define MODULE_ERROR_CLASS MI_1
#define MODULE_ERROR_SUBCLASS 1
#define ERROR_FOO CREATE_ERROR_CODE(MODULE_ERROR_CLASS, MODULE_ERROR_SUBCLASS, 1)
I would have a large number of similar #defines matching ERROR_[\w_]+ that I'd like to enumerate so that I always have a current list of error codes that the program can output. I need the numerical value because that's all the program will print out (and no, it's not an option to print out a string instead).
Suggestions for gcc or any other compiler would be helpful.
GCC's -dM preprocessor option might get you what you want (for example, gcc -dM -E header3.h lists every #define that is in effect after preprocessing that header).
I think the solution is a combo of @nmichaels's and @aschepler's answers.
Use gcc's -dM option to get a list of the macros.
Use perl or awk or whatever to create 2 files from this list:
1) Macros.h, containing just the #defines.
2) Codes.c, which contains
#include "Macros.h"
ERROR_FOO = "ERROR_FOO"
ERROR_BAR = "ERROR_BAR"
(i.e., extract each #define ERROR_x into a line with the macro and a string.)
Now run gcc -E Codes.c. That should create a file with all the macros expanded. The output should look something like:
1 = "ERROR_FOO"
2 = "ERROR_BAR"
I don't have gcc handy, so haven't tested this...
The program 'coan' looks like the tool you are after. It has the 'defs' sub-command, which is described as:
defs [OPTION...] [file...] [directory...]
Select #define and #undef directives from the input files in accordance with the options and report them on the standard output in accordance with the options.
See the cited URL for more information about the options. Obtain the code here.
If you have a complete list of the macros you want to see, and all are numeric, you can compile and run a short program just for this purpose:
#include <header3.h>
#include <stdio.h>
#define SHOW(x) printf(#x " = %lld\n", (long long int) x)
int main(void) {
SHOW(ERROR_FOO);
/*...*/
return 0;
}
As #nmichaels mentioned, gcc's -d flags may help get that list of macros to show.
Here's a little creative solution:
Write a program to match all of your identifiers with a regular expression (like \#define :b+(?<NAME>[0-9_A-Za-z]+):b+(?<VALUE>[^(].+)$ in .NET), then have it create another C file with just the names matched:
void main() {
/*my_define_1*/ my_define_1;
/*my_define_2*/ my_define_2;
//...
}
Then pre-process your file using the /C /P option (for VC++), and you should get all of those replaced with the values. Then use another regex to swap things around, and put the comments before the values in #define format -- now you have the list of #define's!
(You can do something similar with GCC, e.g. with its -E, -P and -C options.)
Is there any elegant way I can get a list of #defines with their final numerical values
For various levels of elegance, sort of.
#!/bin/bash
file="mount.c";
for macro in $(grep -Po '(?<=#define)\s+(\S+)' "$file"); do
echo -en "$macro: ";
echo -en '#include "'"$file"'"\n'"$macro\n" | \
cpp -E -P -x c ${CPPFLAGS} - | tail -n1;
done;
Not foolproof (a #define with a backslash-newline between the directive and the macro name would not be caught, but no style I've seen does that).