Trying to figure out something simple in a C macro. Take this code, for example:
#include <stdio.h>
#define MACRO(b) printf("%d\n", b*b)
int main()
{
MACRO(4+1);
}
The output for this code is 9, but I think it should be 25.
I have no idea why the result is 9 and not 25.
When you use the macro, the preprocessor replaces it and its arguments quite verbatim, so the macro expansion in your code will look like
printf("%d\n", 4+1*4+1);
Since * binds tighter than +, that is evaluated as 4 + (1*4) + 1 = 9. That will not give you the result you want, and it is one of the reasons function-like macros are looked down upon.
You need to use parentheses to make sure this problem doesn't occur:
#define MACRO(b) printf("%d\n", (b)*(b))
which will then result in the following expansion:
printf("%d\n", (4+1)*(4+1));
Whenever you have problems with something preprocessor-related, just about all compilers have an option to stop after the preprocessing stage. That lets you inspect the preprocessed source, which helps with problems like this.
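For example, with GCC or Clang you can run gcc -E file.c to print the source after preprocessing; other compilers have similar options.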
Also note another possible problem: if the expression you pass as an argument to the macro is, for example, a function call with a side effect, that function will be called twice and the side effect will happen twice, even when using parentheses.
A simple example, using your macro:
#include <stdio.h>
#define MACRO(b) printf("%d\n", (b)*(b))
int my_function(void)
{
printf("Foo\n");
return 1;
}
int main(void)
{
MACRO(my_function());
}
The above program will print "Foo\n" twice. If MACRO were a proper function, the call to my_function would happen only once, and so would its printout.
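If MACRO is rewritten as an ordinary function, the argument is evaluated exactly once. A minimal sketch (print_square is a hypothetical name, not part of the original code):
#include <stdio.h>
/* Hypothetical function-based replacement for MACRO: the argument
   expression is evaluated once, then its value is squared. */
static void print_square(int b)
{
printf("%d\n", b * b);
}
int my_function(void)
{
printf("Foo\n");
return 1;
}
int main(void)
{
print_square(my_function()); /* "Foo" is printed once, then 1 */
}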
Put parentheses around the macro parameters so the arguments are grouped correctly:
#include <stdio.h>
#define MACRO(b) printf("%d\n", (b)*(b))
int main()
{
MACRO(4+1);
}
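With the parentheses in place, MACRO(4+1) expands to printf("%d\n", (4+1)*(4+1)); and prints 25, as expected.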
I'm getting an error that I'm not passing enough arguments to my C macro. That makes sense, because the macro is evaluated before the constant I passed it, and that constant is supposed to expand into multiple arguments. What is the best way to fix this?
#include <stdio.h>
#define MYMACRO(a,b,c) ((a)*(b)*(c))
#define MYCONSTANT 3,4
int main(void) {
printf("%d\n", MYMACRO(2, MYCONSTANT));
}
The output is irrelevant to the question. I would just like to know what constitutes the order of #define evaluation.
Macro arguments are counted and delimited at the call site before any expansion happens; they are only macro-expanded when substituted into the replacement text. So at the call MYMACRO(2, MYCONSTANT), the preprocessor sees just two arguments.
You need an additional level of macro to let it expand the argument. The names in my suggestion are chosen from the number of arguments, but you might want to use other names.
#include <stdio.h>
#define MYMACRO3(a,b,c) ((a)*(b)*(c))
#define MYMACRO2(a,b) MYMACRO3(a,b)
#define MYCONSTANT 3,4
int main(void) {
printf("%d\n", MYMACRO2(2, MYCONSTANT));
}
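Here the arguments of MYMACRO2 are expanded before substitution, so b becomes 3,4; the replacement is then rescanned as MYMACRO3(2, 3,4), which expands to ((2)*(3)*(4)) and prints 24.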
But as others say, simply avoid preprocessor magic. It will give you all kinds of headaches.
I'm having trouble explaining this well: when I call a function, I want its body to be inserted into main's code at the point of the call, so I can avoid typing it out multiple times, while still directly affecting variables defined in main's scope. What's the best way to achieve this?
EDIT: I should probably make it clear I also want it to take a single argument.
Sounds like you need a preprocessor macro. These aren't real functions, but blocks of code that the preprocessor substitutes before the code is compiled. E.g., consider a simple macro to increment a number:
#include <stdio.h>
#define INC(x) (x)++
int main() {
int a = 1;
INC(a);
INC(a);
printf("%d\n", a);
return 0;
}
The text INC(a) will be replaced with (a)++, so running this program will print out 3 (1 after two increments).
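The parentheses around x in the definition matter once the argument is more than a plain variable. A small sketch of what can go wrong without them (INC_BAD is a made-up name for contrast):
#include <stdio.h>
#define INC_BAD(x) x++   /* no parentheses around the parameter */
#define INC(x) (x)++     /* parameter parenthesized */
int main(void)
{
int a[] = {1, 2};
int *p = a;
INC_BAD(*p); /* expands to *p++ : the pointer moves, a[0] stays 1 */
printf("%d %d\n", a[0], a[1]); /* prints 1 2 */
p = a;
INC(*p); /* expands to (*p)++ : a[0] becomes 2 */
printf("%d %d\n", a[0], a[1]); /* prints 2 2 */
return 0;
}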
void fun()
{
// Essentially this is a function with an empty body
// And I don't care about () in a macro
// Because this is evil, regardless
#define printf(a, b) (printf)(a, b*2)
}
void main() // I know this is not a valid main() signature
{
int x = 20;
fun();
x = 10;
printf("%d", x);
}
I'm having doubts about the #define line! Can you please give me some links to documentation for understanding this line of code? The answer is 20.
The #define defines a preprocessor macro, which is processed by the preprocessor before the compiler does anything.
The preprocessor doesn't even know if the line of code is inside or outside a function.
Generally, macros are defined after the inclusion of header files, i.e. after the #include statements.
Preprocessor macros are not part of the actual C language; handling of macros and other preprocessor directives is a separate step done before the compiler¹. This means that macros do not follow the rules of C, especially with regard to scoping: macros are always "global".
That means the printf function you think you call in the main function is not actually the printf function; it's the printf macro.
The code you show will look like this after preprocessing (and removal of comments):
void fun()
{
}
void main()
{
int x = 20;
fun();
x = 10;
(printf)("%d", x*2);
}
What happens is that the invocation of the printf macro is replaced with a call to the printf function. And since the second argument of the macro is multiplied by two, the output will be 10 * 2 which is 20.
This program illustrates a major problem with macros: it's too easy to make a program look normal while it does something unexpected. It's simple to define a macro true that actually evaluates to false (and the opposite), changing the meaning of comparisons against true or false completely. The only thing you should learn from an example like this is how bad macros can be, and that you should never use macros to "redefine" the language or standard functions. Used sparingly and well, macros are good and make programming in C easier. Used wrongly, like in this example, they make the code unreadable and unmaintainable.
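Purely as an illustration of that true/false trap (this assumes a pre-C23 compiler, where true and false are not keywords, and is deliberately broken code):
#include <stdio.h>
#define true 0 /* evil: never do this */
int main(void)
{
if (true)
printf("never reached\n");
else
printf("\"true\" is actually 0 here\n");
return 0;
}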
¹ The preprocessor used to be a separate program that ran before the compiler program. Modern compilers have the preprocessing step built in, but it's still a separate step before the actual C code is parsed.
Let me put this in another way.
printf() is a standard library function that sends formatted output to stdout (your console screen). The printf() call is executed at run time. The syntax looks like this:
int printf(const char *format, ...)
But this program of yours replaces the printf() call with your macro expansion before compilation.
void fun(){
#define printf(a, b) printf(a, b*2)
}
void main() {
int x = 20;
fun();
x = 10;
printf("%d", x);
}
So what happens is that before compilation the preprocessor replaces the call to the standard function with your own user-defined macro, which takes two arguments:
a = "%d", the format specifier, and
b = the value of x, which is 10.
So the printed value is x*2 = 20.
#include <stdio.h>
#define sqr(a) a*a
int main()
{
int i;
printf("%d",64/sqr(4));
return 0;
}
Why am I getting the output as 64?
Normally what I'd expect is for sqr(4) to be evaluated first and then the division. With any other operator it works fine.
Please explain.
After preprocessing the printf line will be:
printf("%d",64/4*4);
which should explain why it prints 64.
Always use parentheses in macro definitions when they contain expressions:
#define sqr(a) ((a)*(a))
Even this is not safe against macro invocations like sqr(x++). So don't use macros unless you have to :)
Macros aren't functions, they're (mostly) dumb text replacement. Your macro lacks parentheses, so when the preprocessor replaces it, you get the following:
printf("%d",64/4*4);
Since / and * have the same precedence, this is like stating (64/4)*4, which is, of course, 64.
If you want your sqr macro to safely square its argument, wrap both the parameter and the whole expansion in parentheses:
#define sqr(a) ((a)*(a))
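To see why wrapping only the whole expansion is not enough, compare the two variants below (SQR_HALF is a made-up name for the partially parenthesized version):
#include <stdio.h>
#define SQR_HALF(a) (a*a)     /* only the expansion is wrapped */
#define SQR(a) ((a)*(a))      /* parameter and expansion both wrapped */
int main(void)
{
printf("%d\n", SQR_HALF(1+1)); /* (1+1*1+1) -> prints 3 */
printf("%d\n", SQR(1+1));      /* ((1+1)*(1+1)) -> prints 4 */
return 0;
}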
64/sqr(4) is expanded to 64/4*4.
You have to parenthesize the macro body and should also put parentheses around the argument:
#define sqr(a) ((a)*(a))
Yields: 64/((4)*(4))
Note that macros are pure textual replacements performed by the C preprocessor before the actual compilation starts.
A better and type-safe approach would be to use an inline function:
static inline int sqr(int a)
{
return a * a;
}
This will likely be inlined into the calling code by the compiler, just as the macro would be. On the other hand, the compiler might very well decide not to, depending on internal heuristics. In general, you should trust the compiler to choose the proper way, unless you have serious performance or code-size issues.
This will also protect against evaluating the argument a twice. That is critical if there are side-effects like
int i[2], *p = i;
sqr(*p++);
For the macro version, this will result in undefined behaviour (undefined order of evaluation):
((*p++) * (*p++))
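With the inline function, *p++ is evaluated exactly once to produce the argument, so the call is well defined: p advances by one element and the first array element is squared.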
I was writing some test code in C. By mistake I had inserted a ; after a #define, which gave me errors. Why is a semicolon not required for #defines?
More specifically :
Method 1: works
const int MAX_STRING = 256;
int main(void) {
char buffer[MAX_STRING];
}
Method 2: Does not work - compilation error.
#define MAX_STRING 256;
int main(void) {
char buffer[MAX_STRING];
}
What is the reason for the different behavior of these two programs? Aren't both MAX_STRINGs constants?
#define MAX_STRING 256;
means:
whenever you find MAX_STRING when preprocessing, replace it with 256;. In your case it'll make method 2:
#include <stdio.h>
#include <stdlib.h>
#define MAX_STRING 256;
int main(void) {
char buffer [256;];
}
which isn't valid syntax. Replace
#define MAX_STRING 256;
with
#define MAX_STRING 256
The difference between your two versions is that in the first method you declare a constant equal to 256, but in the second you define MAX_STRING to stand for the text 256; in your source file.
The #define directive is used to define values or macros that are used by the preprocessor to manipulate the program source code before it is compiled. Because preprocessor definitions are substituted before the compiler acts on the source code, any errors that are introduced by #define are difficult to trace.
The syntax is:
#define CONST_NAME VALUE
If there is a ; at the end, it's considered part of VALUE.
To understand how exactly #defines work, try defining:
#define FOREVER for(;;)
...
FOREVER {
// perform something forever
}
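As a compilable sketch (the loop body and the break condition are invented for illustration):
#include <stdio.h>
#define FOREVER for(;;)
int main(void)
{
int i = 0;
FOREVER { /* expands to: for(;;) { ... } */
if (++i > 3) break;
printf("tick %d\n", i);
}
return 0;
}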
Interesting remark by John Hascall:
Most compilers will give you a way to see the output after the preprocessor phase, this can aid with debugging issues like this.
In gcc it can be done with flag -E.
#define is a preprocessor directive, not a statement or declaration as defined by the C grammar (both of those are required to end with a semicolon). The rules for the syntax of each one are different.
#define is a preprocessor directive and a simple textual replacement; it is not a declaration.
BTW, as a replacement it may contain some ; as part of it:
// Ugly as hell, but valid
#define END_STATEMENT ;
int a = 1 END_STATEMENT // preprocessed to -> int a = 1;
Both constants? No.
The first method does not produce a constant in C. Const-qualified variables do not qualify as constants in C. Your first method works only because compilers supporting C99 and later allow variable-length arrays (VLAs); your buffer is a VLA in the first case precisely because MAX_STRING is not a constant. Try declaring the same array at file scope and you'll get an error, since VLAs are not allowed at file scope.
The second method can be used to assign names to constant values in C, but you have to do it properly. The ; in the macro definition should not be there. Macros work through textual substitution, and you don't want to substitute that extra ; into your array declaration. The proper way to define that macro would be
#define MAX_STRING 256
In C, when it comes to defining proper named constants, you are basically limited to macros and enums. Don't use const "constants" unless you know they will work for your purposes.
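For example, an enum constant gives you a true integer constant expression (a minimal sketch):
/* An enum constant is a genuine integer constant expression. */
enum { MAX_STRING = 256 };
char buffer[MAX_STRING]; /* OK even at file scope, unlike a const int */
int main(void)
{
return 0;
}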
Because that is how the syntax was defined for preprocessor directives.
Only statements end with a ; in C/C++; #define is a preprocessor directive, not a statement.
The second version does not define a constant as far as the language is concerned, just a substitution rule for a block of text. Once the preprocessor has done its job, the compiler sees
char buffer [256;];
which is not syntactically valid.
The moral of the story: prefer the const int MAX_STRING = 256; way as that helps you, the compiler, and the debugger.
This preprocessor directive:
#define MAX_STRING 256;
tells the preprocessor to replace every MAX_STRING with 256;, semicolon included. Preprocessor directives don't need a semicolon at the end; if you put one there, the preprocessor treats it as part of the replacement text.
If you are confused with #defines for constants, const int would probably be easier to comprehend.
If you want to learn more on how to properly use these preprocessor directives, try looking at this website.