K&R C, 2nd edition (section 2.3), mentions:
A constant expression is an expression that involves only constants. Such expressions may be evaluated during compilation rather than at run-time, and accordingly may be used in any place that a constant can occur.
However, I have several doubts regarding it:
Will this expression be considered as constant expression?
const int x=5;
const int y=6;
int z=x+y;
i.e., is using the const keyword considered a constant expression or not?
Is there any technique by which we can check whether an expression was evaluated during compilation or during run-time?
Are there any cases where compile time evaluation produces different result than run-time evaluation?
Should I even care about it? (maybe I use it to optimize my programs)
Perhaps. A compiler can add more forms of constant expressions, so if it can prove to itself that the variable references are constant enough it can compute the expression at compile-time.
You can (of course) disassemble the code and see what the compiler did.
Not if the compiler is standards-compliant, no. The standard says "The semantic rules for the evaluation of a constant expression are the same as for nonconstant expressions" (§6.6 11 in the C11 draft).
Not very much, no. :) But do use const for code like that anyway!
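To make point 2 concrete, here is a minimal probe (the file name and flags are just an illustration): compile with optimization, ask the compiler for the generated assembly, and read it.

/* const_check.c -- illustrative only. Compile with e.g. `gcc -O2 -S const_check.c`
 * and inspect const_check.s: if the compiler folded the expression you will see
 * the literal 11 rather than loads of x and y followed by an add. */
const int x = 5;
const int y = 6;

int z_value(void)
{
    return x + y;   /* most compilers fold this to 11 at -O2, but nothing requires it */
}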
Is using the const keyword considered a constant expression or not?
>> No, it is not a constant expression. A variable declared with const is const-qualified, but it is not a compile-time constant.
Is there any technique by which we can check whether an expression was evaluated during compilation or during run-time?
>> (as mentioned in Mr. Unwind's answer) Disassemble the code.
Are there any cases where compile time evaluation produces different result than run-time evaluation?
>> No, it will not. Refer to §6.6 11 in the C11 standard.
FWIW, when used as the operand of the sizeof operator (handled at compile time, though not itself a constant expression), a NULL pointer dereference is fine, because the operand of sizeof is not evaluated. An actual run-time NULL pointer dereference invokes undefined behaviour.
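A minimal sketch of that sizeof point (assuming a hosted implementation):

#include <stdio.h>

int main(void)
{
    int *p = NULL;
    /* The operand of sizeof is not evaluated here (int is not a VLA type),
     * so dereferencing the null pointer inside sizeof is fine and the result
     * is simply sizeof(int). */
    printf("%zu\n", sizeof *p);
    /* (*p = 1; would be an actual run-time dereference: undefined behaviour.) */
    return 0;
}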
Should I even care about it? (maybe I use it to optimize my programs)
>> Opinion-based, so won't answer.
x and y are const, z is not. The compiler will probably substitute x and y, but will not substitute z. However, it will probably also compute 5 + 6 at compile time and assign the result to z directly.
Not sure; you can check the generated assembler code, but I do not know of another way to do it.
No. Compile-time evaluation just means the expression has already been calculated by the time the program runs.
I care :) but it applies only when you need fast execution.
In C, the const qualifier is just a guarantee given by the programmer to the compiler that the object will not be changed. Otherwise it does not have the special meaning it has in C++. The initializer for such objects at file scope has to be a constant expression.
As an extension, gcc has a builtin function (int __builtin_constant_p (exp)) to determine if a value is constant.
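A small sketch of that builtin (GCC-specific; the results for the variables depend on the optimization level, so treat the comments as typical, not guaranteed):

#include <stdio.h>

int main(void)
{
    const int x = 5;
    int y = 5;

    printf("%d\n", __builtin_constant_p(5));   /* 1: a literal is always a known constant */
    printf("%d\n", __builtin_constant_p(x));   /* 0 at -O0; often 1 once the optimizer runs */
    printf("%d\n", __builtin_constant_p(y));   /* likewise depends on optimization */
    return 0;
}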
No, it shall not - unless you exploit implementation-defined or undefined behaviour and the compiler and target behave differently. [1]
As constant expressions are evaluated at compile-time, they save processing time and often code space, and possibly data space. Also, in some places (e.g. global initializers), only constant expressions are allowed. See the standard.
[1]: One example is right-shifting a signed negative integer constant, e.g. -1 >> 24. As that is implementation-defined, the compiler might yield a different result from a program run using a variable which holds the same value:
int i = -1;
(-1 >> 24) == (i >> 24)
  ^            ^--- run-time evaluated by target
  +--- compile-time evaluated by compiler
The comparison might fail.
GCC 4.9 and 5.1 reject this simple C99 declaration at global scope. Clang accepts it.
const int a = 1, b = a; // error: initializer element is not constant
How could such a basic feature be missing? It seems very straightforward.
C99 [1] section 6.6 Constant expressions is the controlling section. It states in subsections 6 and 7:
6/ An integer constant expression shall have integer type and shall only have operands that are integer constants, enumeration constants, character constants, sizeof expressions whose results are integer constants, and floating constants that are the immediate operands of casts.
Cast operators in an integer constant expression shall only convert arithmetic types to integer types, except as part of an operand to the sizeof operator.
The definition of integer and floating point constants is specified in 6.4.4 of the standard, and it's restricted to actual values (literals) rather than variables.
7/ More latitude is permitted for constant expressions in initializers. Such a constant expression shall be, or evaluate to, one of the following (a) an arithmetic constant expression, (b) a null pointer constant, (c) an address constant, or (d) an address constant for an object type plus or minus an integer constant expression.
Since a is none of those things in either subsection 6 or 7, it is not considered a constant expression as per the standard.
The real question, therefore, is not why gcc rejects it but why clang accepts it, and that appears to be buried in subsection 10 of that same section:
10/ An implementation may accept other forms of constant expressions.
In other words, the standard states what an implementation must allow for constant expressions but doesn't limit implementations to allowing only that.
[1] C11 is much the same other than minor things like allowing _Alignof as well as sizeof.
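If you need the second initializer to be portable, the usual workaround (my addition, not part of the quoted standard text) is to route the value through an enumeration constant, which is an integer constant expression:

/* Portable at file scope: */
enum { A = 1 };
const int a = A;        /* fine everywhere */
const int b = A;        /* also fine: A is an integer constant expression */

/* whereas this relies on the "other forms" latitude of 6.6/10: */
/* const int c = 1, d = c;   -- accepted by Clang, rejected by GCC */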
This is just the rules of C. It has always been that way. At file scope, initializers must be constant expressions. The definition of a constant expression does not include variables declared with const qualifier.
The rationale behind requiring initializers computable at compile time was that the compiler could then put all of the initialized static data as a block in the executable file; at load time that block is loaded into memory as a whole and, voila, the global variables all have their correct initial values without any code needing to be executed.
In fact, if you could have executable code as an initializer for global variables, it would introduce quite a lot of complication regarding which order that code should be run in. (This is still a problem in modern C++.)
In K&R C, there was no const. They could have had a rule that if a global variable is initialized by a constant expression, then that variable also counts as a constant expression. And when const was added in C89, they could have also added a rule that const int a = 5; leads to a constant expression.
However they didn't. I don't know why for sure, but it seems likely that it has to do with keeping the language simple. Consider this:
extern const int a, b = a;
with const int a = 5; being in another translation unit. Whether or not you want to allow this, it is considerably more complicated for the compiler, and involves some more arbitrary decisions.
If you look at the current C++ rules for constant expressions (which still are not settled to everyone's satisfaction!) you'll see that each time you add support for one more "obvious" thing then there are two other "obvious" things that are next in line and it is never-ending.
In the early days of C, in the 1970s, keeping the compiler simple was important so it may have been that making the compiler support this meant the compiler used too many system resources, or something. (Hopefully a coder from that era can step in and comment more on this!)
Finally, the C89 standardization was quite a contentious process since there were so many different C compilers that had each gone their own way with language evolution. Demanding that a compiler vendor who doesn't support this, change their compiler to support it might be met with opposition, lowering the uptake of the standard.
Because const doesn't make a constant expression -- it makes a variable that can't be assigned to (only initialized). You need constexpr to make a constant expression, which is only available in C++. C99 has no way of making a named constant expression (other than a macro, which is sort-of, but not really an expression at all).
In many programs a #define serves the same purpose as a constant. For example:
#define FIELD_WIDTH 10
const int fieldWidth = 10;
I commonly see the first form preferred over the other, relying on the pre-processor to handle what is basically an application decision. Is there a reason for this tradition?
There is a very solid reason for this: const in C does not mean something is constant. It just means a variable is read-only.
In places where the compiler requires a true constant (such as array sizes for non-VLA arrays), using a const variable such as fieldWidth is simply not possible.
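For instance, at file scope (where VLAs are not allowed), only the macro version works; this is a minimal sketch:

#define FIELD_WIDTH 10
const int fieldWidth = 10;

char line1[FIELD_WIDTH];   /* OK: FIELD_WIDTH expands to the constant 10 */
char line2[fieldWidth];    /* error: not a constant expression at file scope */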
They're different.
const is just a qualifier, which says that a variable cannot be changed at runtime. But all other features of the variable persist: it has allocated storage, and this storage may be addressed. So code does not just treat it as a literal, but refers to the variable by accessing the specified memory location (except if it is static const, then it can be optimized away), and loading its value at runtime. And as a const variable has allocated storage, if you add it to a header and include it in several C sources, you'll get a "multiple symbol definition" linkage error unless you mark it as extern. And in this case the compiler can't optimize code against its actual value (unless global optimization is on).
#define simply substitutes a name with its value. Furthermore, a #define'd constant may be used in the preprocessor: you can use it with #ifdef to do conditional compilation based on its value, or use the stringizing operator # to get a string with its value. And as the compiler knows its value at compile time it may optimize code based on that value.
For example:
#define SCALE 1
...
scaled_x = x * SCALE;
When SCALE is defined as 1 the compiler can eliminate the multiplication as it knows that x * 1 == x, but if SCALE is an (extern) const, it will need to generate code to fetch the value and perform the multiplication because the value will not be known until the linking stage. (extern is needed to use the constant from several source files.)
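The preprocessor-only uses mentioned above (conditional compilation and stringizing) could look roughly like this standalone sketch (SCALE is redefined here just for the example):

#include <stdio.h>

#define SCALE 2

#if SCALE > 1
#define MODE "scaled"
#else
#define MODE "unscaled"
#endif

#define STR_(x) #x
#define STR(x)  STR_(x)   /* expand SCALE first, then stringize it */

int main(void)
{
    puts("mode: " MODE ", factor " STR(SCALE));   /* prints: mode: scaled, factor 2 */
    return 0;
}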
A closer equivalent to using #define is using enumerations:
enum dummy_enum {
constant_value = 10010
};
But this is restricted to integer values and doesn't have advantages of #define, so it is not widely used.
const is useful when you need to import a constant value from some library where it was compiled in. Or if it is used with pointers. Or if it is an array of constant values accessed through a variable index value. Otherwise, const has no advantages over #define.
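The "array of constant values accessed through a variable index" case might look like this sketch (the names are made up):

/* const is the natural fit here: the table can live in read-only storage
 * and is indexed at run time, which a #define cannot express. */
static const double coeffs[3] = { 0.25, 0.50, 0.25 };

double smooth(const double *samples)
{
    double sum = 0.0;
    for (int i = 0; i < 3; ++i)
        sum += coeffs[i] * samples[i];
    return sum;
}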
The reason is that most of the time, you want a constant, not a const-qualified variable. The two are not remotely the same in the C language. For example, variables are not valid as part of initializers for static-storage-duration objects, nor as non-VLA array dimensions (for example the size of an array in a structure, or any array prior to C99).
Expanding on R's answer a little bit: fieldWidth is not a constant expression; it's a const-qualified variable. Its value is not established until run-time, so it cannot be used where a compile-time constant expression is required (such as in an array declaration, or a case label in a switch statement, etc.).
Compare with the macro FIELD_WIDTH, which after preprocessing expands to the constant expression 10; this value is known at compile time, so it can be used for array dimensions, case labels, etc.
To add to R.'s and Bart's answer: there is only one way to define symbolic compile time constants in C: enumeration type constants. The standard imposes that these are of type int. I personally would write your example as
enum { fieldWidth = 10 };
But I guess that taste differs much among C programmers about that.
Although a const int will not always be appropriate, an enum will usually work as a substitute for the #define if you are defining something to be an integral value. This is actually my preference in such a case.
enum { FIELD_WIDTH = 16384 };
char buf[FIELD_WIDTH];
In C++ this is a huge advantage as you can scope your enum in a class or namespace, whereas you cannot scope a #define.
In C you don't have namespaces and cannot scope an enum inside a struct, and I am not even sure you get the type safety, so I cannot actually see any major advantage, although maybe some C programmer will point one out to me.
According to K&R (2nd edition, page 211), the "const and volatile properties are new with the ANSI standard". This may imply that really old pre-ANSI code did not have these keywords at all and it really is just a matter of tradition.
Moreover, it says that a compiler should detect attempts to change const variables but, other than that, it may ignore these qualifiers. I think this means that some compilers may not optimize code containing const variables to use the value as an immediate in the machine code (as #define does), and this might cost additional time for accessing far memory and affect performance.
Some C compilers will store all const variables in the binary, which, if you are preparing a large list of coefficients, can use up a tremendous amount of space in the embedded world.
Conversely: using const allows flashing over an existing program to alter specific parameters.
The best way to define numeric constants in C is using enum. Read the corresponding chapter of K&R's The C Programming Language, page 39.
In C, shall I prefer constants over defines? I've been reading a lot of code lately, and all of the examples make heavy use of defines.
No, in general you should not use const-qualified objects in C to create named constants. In order to create a named constant in C you should use either macros (#define) or enums. In fact, the C language has no constants in the sense that you seem to imply. (C is significantly different from C++ in this regard.)
In the C language the notions of constant and constant expression are defined very differently from C++. In C, constant means a literal value, like 123. Here are some examples of constants in C:
123
34.58
'x'
Constants in C can be used to build constant expressions. However, since const-qualified objects of any type are not constants in C, they cannot be used in constant expressions, and, consequently, you cannot use const-qualified objects where constant expressions are required.
For example, the following is not a constant
const int C = 123; /* C is not a constant!!! */
and since the above C is not a constant, it cannot be used to declare an array type at file scope
typedef int TArray[C]; /* ERROR: constant expression required */
It cannot be used as a case label
switch (i) {
case C: ; /* ERROR: constant expression required */
}
It cannot be used as bit-field width
struct S {
int f : C; /* ERROR: constant expression required */
};
It cannot be used as an initializer for an object with static storage duration
static int i = C; /* ERROR: constant expression required */
It cannot be used as an enum initializer
enum {
E = C /* ERROR: constant expression required */
};
i.e. it cannot be used anywhere a constant is required.
This might seem counter-intuitive, but this is how the C language is defined.
This is why you see these numerous #defines in the code you are working with. Again, in the C language const-qualified objects have very limited use. They are basically completely useless as "constants", which is why in C you are basically forced to use #define or enums to declare true constants.
Of course, in situations when a const-qualified object works for you, i.e. it does what you want it to do, it is indeed superior to macros in many ways, since it is scoped and typed. You should probably prefer such objects where applicable, however in general case you'll have to take into account the above limitations.
Constants should be preferred over defines. There are several advantages:
Type safety. While C is a weakly typed language, using a define loses all of the type safety that would otherwise allow the compiler to pick up problems for you.
Ease of debugging. You can change the value of constants through the debugger, while defines are automatically changed in the code by the pre-processor to the actual value, meaning that if you want to change the value for test/debugging purposes, you need to re-compile.
Maybe I have been using them wrong but, at least in gcc, you can't use constants in case statements.
const int A = 12;

switch (argc) {
case A:
    break;
}
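A common workaround (my addition, not part of the answer above) is an enumeration constant, which is a genuine integer constant expression:

enum { A = 12 };

switch (argc) {
case A:            /* OK: an enumeration constant may appear in a case label */
    break;
default:
    break;
}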
Though this question is specific to C, I guess it is good to know this:
#include <stdio.h>

int main(void) {
    const int CON = 123;
    int *A = &CON;        /* C compilers typically only warn here; C++ rejects it */
    (*A)++;               /* modifying a const object is undefined behaviour */
    printf("%d\n", CON);  /* commonly prints 124 in C, but nothing is guaranteed */
}
compiles (typically with just a warning) and "works" in C, but not in C++.
One of the reasons to use #define is to avoid such things messing up your code, especially when it is a mix of C and C++.
A lot of people here are giving you "C++ style" advice. Some even say the C++ arguments apply to C. That may be a fair point. (Whether it is or not feels kind of subjective.) The people who say const sometimes means something different in the two languages are also correct.
But these are mostly minor points and personally, I think in truth there is relatively minor consequence to going either way. It's a matter of style, and I think different groups of people will give you different answers.
In terms of common usage, historical usage, and most common style, in C, it's a lot more typical to see #define. Using C++isms in C code can come off as strange to a certain narrow segment of C coders. (Including me, so that's where my biases lie.)
But I'm surprised no one has suggested a middle ground solution, that "feels right" in both languages: if it fits into a group of integer constants, use an enum.
#define can be used for many purposes (it is very loose) and should be avoided if you can substitute it with const, which defines a typed variable that you can do a lot more with.
In cases like the ones below, #define has to be used:
conditional-compilation directive switches
textual substitution into your source lines
code macros
An example where you have to use #define rather than const is when you have a version number, say 3, and you want version 4 to include some code that is not available in version 3:
#define VERSION 4
...
#if VERSION==4
................
#endif
Defines have been part of the language longer than constants, so a lot of older code will use them, because defines were the only way to get the job done when the code was written. For more recent code it may simply be a matter of programmer habit.
Constants have a type as well as a value, so they would be preferred when it makes sense for your value to have a type, but not when it is typeless (or polymorphic).
If it's something that isn't determined programmatically, I use #define. For example, if I want all of my UI objects to have the same space between them, I might use #define kGUISpace 20.
Apart from the excellent reasons given by AndreyT for using #defines rather than constants in C code, there is another, more pragmatic reason for using #defines.
#defines are easy to define and use from (.h) header files, which is where any experienced C coder would expect to find constants defined. Defining consts in header files is not quite so easy -- it's more code to avoid duplicate definitions, etc.
Also, the "type-safe" arguments are mostly moot: most compilers will pick up glaring errors such as assigning a string to an int, or will "do the right thing" on a slight mismatch such as assigning an integer to a float.
Macros (defines) can be used by the pre-processor and at compile time, constants cannot.
You can do compile-time checks to make sure a macro is within a valid range (and #error or #fatal if it isn't). You can use default values for a macro if it hasn't already been defined. You can use a macro in the size of an array.
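Those preprocessor-only capabilities might look like this sketch (the macro name is illustrative):

/* Default value if the build did not define it, e.g. with -DBUFFER_SIZE=128 */
#ifndef BUFFER_SIZE
#define BUFFER_SIZE 64
#endif

/* Compile-time range check */
#if BUFFER_SIZE < 16 || BUFFER_SIZE > 4096
#error "BUFFER_SIZE out of range"
#endif

char buffer[BUFFER_SIZE];   /* usable as a (non-VLA) array size */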
A compiler can optimize with macros better than it can with constants:
const int SIZE_A = 15;
#define SIZE_B 15
for (i = 0; i < SIZE_A + 1; ++i); // if not optimized may load A and add 1 on each pass
for (i = 0; i < SIZE_B + 1; ++i); // compiler will replace "SIZE_B + 1" with 16
Most of my work is with embedded processors that don't have amazing optimizing compilers. Maybe gcc will treat SIZE_A like a macro at some optimization level.
Can someone point out the advantages and disadvantages of using #define versus constants? Most of my work is done in C and Objective-C.
As 0A0D mentioned, there are #defines, enums, and const variables. It's worth noting that const-qualified variables are not considered to be compile-time constants in C and therefore can't be used in some circumstances (e.g. when declaring the size of an array).
enum constants are compile-time constants, however. For integral values, IMO it's usually better to prefer enums over const variables over #define.
Actually there are three ways of defining such constants:
defines
enums
const variables
In C, everything is an int unless otherwise specified. I prefer enums when I have a number of related integer constants. Enums are clearly preferable when you don't care what the values are. But even when you do need to specify the values for all the constants, I like the mental grouping of an enum. Code documents itself better when you have the type, e.g.
Error MyFunc();
clearly returns one of a particular set of error codes, whereas
int MyFunc()
might return one of the #define'd list for Unix errno, or maybe something else, or maybe those plus some idiosyncratic values -- who knows? If you have more than one set of return codes, which set does this function use?
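A minimal sketch of that self-documenting return type (the names here are invented for illustration):

typedef enum {
    ERR_OK = 0,
    ERR_NOT_FOUND,
    ERR_NO_MEMORY
} Error;

Error MyFunc(void);   /* the type itself says which set of codes can come back */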
The more specific enum type name helps the tags facility in your editor, greps, debugging, and so on.
A strict lint may give you some warnings about using enums as integers, for example if you add or OR them, or pass an enum to an int.
A const object is different from either an enum or a #define, particularly in C. In ANSI C, a const int takes up space just as a regular int; most compilers will also generate pointer references to this address rather than inlining the value. As a result, I rarely use const int's in C. (C++ has slightly different semantics, and so the choices are different there.)
Every compiler I've ever used has the option to store enums in the smallest space possible. Usually it's even the default option. To force wider enums when using such an option, I usually throw in an extra unsigned value:
typedef enum
{
    MyEnumA,
    MyEnumB,
    MyEnumForce16 = 0x7fff
} MyEnum;
The use of an enumeration constant (enum) has many advantages over using the traditional symbolic constant style of #define. These advantages include a lower maintenance requirement, improved program readability, and better debugging capability.
1) The first advantage is that enumerated constants are assigned their values automatically by the compiler. Conversely, symbolic constants must be manually assigned values by the programmer.
For instance, if you had an enumerated constant type for error codes that could occur in your program, your enum definition could look something like this:
enum Error_Code
{
OUT_OF_MEMORY,
INSUFFICIENT_DISK_SPACE,
LOGIC_ERROR,
FILE_NOT_FOUND
};
In the preceding example, OUT_OF_MEMORY is automatically assigned the value of 0 (zero) by the compiler because it appears first in the definition. The compiler then continues to automatically assign numbers to the enumerated constants, making INSUFFICIENT_DISK_SPACE equal to 1, LOGIC_ERROR equal to 2, FILE_NOT_FOUND equal to 3, and so on.
If you were to approach the same example by using symbolic constants, your code would look something like this:
#define OUT_OF_MEMORY 0
#define INSUFFICIENT_DISK_SPACE 1
#define LOGIC_ERROR 2
#define FILE_NOT_FOUND 3
Each of the two methods arrives at the same result: four constants assigned numeric values to represent error codes. Consider the maintenance required, however, if you were to add two constants to represent the error codes DRIVE_NOT_READY and CORRUPT_FILE. Using the enumeration constant method, you simply would put these two constants anywhere in the enum definition. The compiler would generate two unique values for these constants. Using the symbolic constant method, you would have to manually assign two new numbers to these constants. Additionally, you would want to ensure that the numbers you assign to these constants are unique.
2) Another advantage of using the enumeration constant method is that your programs are more readable and thus can be understood better by others who might have to update your program later.
3) A third advantage to using enumeration constants is that some symbolic debuggers can print the value of an enumeration constant. Conversely, most symbolic debuggers cannot print the value of a symbolic constant. This can be an enormous help in debugging your program, because if your program is stopped at a line that uses an enum, you can simply inspect that constant and instantly know its value. On the other hand, because most debuggers cannot print #define values, you would most likely have to search for that value by manually looking it up in a header file.
The #define statement is a pre-compiler directive. Technically, any line that begins with a # is something for the pre-compiler to act on. The pre-compiler will replace all instances of the defined token with its definition. So doing this:
#define DELAY 40
for (i = 0; i < DELAY; i++) {
    for (j = 0; j < DELAY; j++) {
        asm NOP;
    }
}
is exactly the same as this (as far as the compiler is concerned):
for (i = 0; i < 40; i++) {
    for (j = 0; j < 40; j++) {
        asm NOP;
    }
}
When the compiler generates machine code, it will see the number 40 and use the immediate addressing mode in order to compare with the accumulator. The number 40 will be stored in the code as many times as you are referencing it. In this case it is twice. Here is the assembly generated by CodeWarrior Ver5:
7: char i,j;
8: for (i=0;i<DELAY;i++) {
0002 95 [2] TSX
0003 7f [2] CLR ,X
0004 [5] L4:
9: for (j=0;j<DELAY;j++) {
0004 6f01 [3] CLR 1,X
0006 [5] L6:
10: asm NOP;
0006 9d [1] NOP
0007 6c01 [4] INC 1,X
0009 e601 [3] LDA 1,X
000b a128 [2] CMP #40 ;<---- notice opcode a1 and immediate constant 40, which is $28 in hexadecimal
000d 25f7 [3] BCS L6
000f 7c [3] INC ,X
0010 f6 [2] LDA ,X
0011 a128 [2] CMP #40 ;<---- and here it is again.
0013 25ef [3] BCS L4
11: }
12: }
13: }
Constants allow you to specify a datatype, which is (usually) an advantage. Macros are much more flexible, and therefore can get you into much more trouble if you're not careful.
Best practice is to use constants as much as possible, and use #define only when you really need a macro, not just a named literal value.
Constants have the advantage of being typed, so using them incorrectly can be discovered at compile time. It may not matter to you but constants take up space in memory while #defines do not (since they are replaced before actual compilation happens).
Constants follow type-safety measures; #defines are substituted outright. Also, as GMan said, #defines don't respect scope.
Explanation for #define: a #define is either an immediate value or a macro.
Explanation for constant: a constant is a value of any type which can never change.
You can declare a pointer to a const, but not to a #define, although a #define could expand to a pointer,
for example: #define ADDRESS ((int *)0x0012)
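For example, a minimal sketch of the pointer difference (identifiers invented here):

const int limit = 100;
const int *p = &limit;        /* fine: limit is an object with an address */

#define LIMIT 100
/* const int *q = &LIMIT; */  /* error: this expands to &100, and 100 has no address */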
So the reasons why we should use constants are as follows:
they obey the language's scoping rules
you can see them in the debugger
you can take their address if you need to
you can pass them by const-reference if you need to
they don't create new "keywords" in your program.
In short, const identifiers act like they're part of the language because they are part of the language.
Within a module, a C compiler could optimize a const as if it were a #define, if there are no pointers declared to the constant.
In CPU terms, the const would become an "immediate" value.
Another alternative is that a const variable could be placed in the code area as opposed to the data area, since it doesn't change.
On some machines, declaring a pointer to a constant could cause an exception if you tried to modify the constant via the pointer.
There are cases where #define is needed, but you should generally avoid it when you have the choice. You should evaluate whether to use const or #define based on business value: time, money, risk.
A const is an object, so you can take its address, for example.
Also it is type-safe, i.e. the compiler knows what the constant's type is.
Neither of the above applies to #define.
const produces an lvalue, meaning its address can be taken. #define doesn't.
#define can cause unintentional macro expansions, which can be a PITA to debug.
As mentioned by others, #define doesn't have a type associated with it.
In general, I'd avoid the preprocessor like the plague for anything I didn't have to use it for, mostly because of the possibility of unintentional expansion and because the ALL_CAPS convention for mitigating this is unbelievably ugly.
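The "unintentional macro expansion" pitfall mentioned above classically looks like this (a deliberately bad macro for illustration):

#define DOUBLE_BAD(x)  x + x          /* textual substitution, no precedence protection */
#define DOUBLE_OK(x)   ((x) + (x))

int a = 3 * DOUBLE_BAD(5);   /* expands to 3 * 5 + 5 == 20, probably not what was meant */
int b = 3 * DOUBLE_OK(5);    /* 30, as intended */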
1) #defines can be considered as tunable parameters that are independent of the datatypes, whereas constants allow us to specify the datatype.
2) #defines replace every reference to them in the code with their value. In addition, we can even have macro functions performing a specific task that can be invoked by passing parameters. These are obviously not possible in the case of constants.
So, they are used based on relevance.
The benefit of using #define is that once you define a name, for example #define NUMBER 30, all the code in main that uses it will use the value 30. If you change the 30 to 40, it directly changes every value in main that uses this name (NUMBER).