Which one is better to use among the below statements in C?
static const int var = 5;
or
#define var 5
or
enum { var = 5 };
It depends on what you need the value for. You (and everyone else so far) omitted the third alternative:
1. static const int var = 5;
2. #define var 5
3. enum { var = 5 };
Ignoring issues about the choice of name, then:
If you need to pass a pointer around, you must use (1).
Since (2) is apparently an option, you don't need to pass pointers around.
Both (1) and (3) have a symbol in the debugger's symbol table - that makes debugging easier. It is more likely that (2) will not have a symbol, leaving you wondering what it is.
(1) cannot be used as a dimension for arrays at global scope; both (2) and (3) can.
(1) cannot be used as a dimension for static arrays at function scope; both (2) and (3) can.
Under C99, all of these can be used for local arrays. Technically, using (1) would imply the use of a VLA (variable-length array), though the dimension referenced by 'var' would of course be fixed at size 5.
(1) cannot be used in places like switch statements; both (2) and (3) can.
(1) cannot be used to initialize static variables; both (2) and (3) can.
(2) can change code that you didn't want changed because it is used by the preprocessor; both (1) and (3) will not have unexpected side-effects like that.
You can detect whether (2) has been set in the preprocessor; neither (1) nor (3) allows that.
So, in most contexts, prefer the 'enum' over the alternatives. Otherwise, the first and last bullet points are likely to be the controlling factors — and you have to think harder if you need to satisfy both at once.
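A minimal sketch of several of these points (the names cvar, MVAR, and evar are mine; assumes C99 or later):
static const int cvar = 5;   /* (1): has an address; not a constant expression in C */
#define MVAR 5               /* (2): pure text substitution */
enum { evar = 5 };           /* (3): a true integer constant */

int ok_global1[MVAR];        /* OK: (2) works for arrays at global scope */
int ok_global2[evar];        /* OK: (3) works too */
/* int bad_global[cvar]; */  /* error: (1) is not a constant expression */

void demo(int x)
{
    const int *p = &cvar;    /* only (1) lets you take an address */
    switch (x) {
    case MVAR:               /* OK; case evar: would work as well */
        break;
    /* case cvar: break; */  /* error: case labels must be constant expressions */
    }
    (void)p;                 /* silence unused-variable warnings */
}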
If you were asking about C++, then you'd use option (1) — the static const — every time.
Generally speaking:
static const
Because it respects scope and is type-safe.
The only caveat I could see: if you want the variable to be possibly defined on the command line. There is still an alternative:
#ifdef VAR // Very bad name: too short, too general, etc.
static int const var = VAR;
#else
static int const var = 5; // default value
#endif
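The default can then be overridden from the command line, e.g. with gcc or clang:
gcc -DVAR=42 file.c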
Whenever possible, instead of macros / ellipsis, use a type-safe alternative.
If you really NEED to go with a macro (for example, you want __FILE__ or __LINE__), then you'd better name your macro VERY carefully. In its naming convention, Boost recommends all upper-case, beginning with the name of the project (here BOOST_); perusing the library, you will notice this is (generally) followed by the name of the particular area (library) and then a meaningful name.
It generally makes for lengthy names :)
In C, specifically? In C the correct answer is: use #define (or, if appropriate, enum)
While it is beneficial to have the scoping and typing properties of a const object, in reality const objects in C (as opposed to C++) are not true constants and therefore are usually useless in most practical cases.
So, in C the choice should be determined by how you plan to use your constant. For example, you can't use a const int object as a case label (while a macro will work). You can't use a const int object as a bit-field width (while a macro will work). In C89/90 you can't use a const object to specify an array size (while a macro will work). Even in C99 you can't use a const object to specify an array size when you need a non-VLA array.
If this is important for you then it will determine your choice. Most of the time, you'll have no choice but to use #define in C. And don't forget the other alternative that produces true constants in C: enum.
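For instance, a minimal sketch of the bit-field case (the names are mine):
#define FLAG_BITS 3                  /* a macro works as a bit-field width */
static const int flag_bits = 3;      /* a const object does not, in C */

struct packed_flags {
    unsigned int mode : FLAG_BITS;       /* OK */
    /* unsigned int bad : flag_bits; */  /* error: width must be a constant expression */
};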
In C++ const objects are true constants, so in C++ it is almost always better to prefer the const variant (no need for explicit static in C++ though).
The difference between static const and #define is that the former uses memory for storage and the latter does not. Secondly, you cannot pass the address of a #define, whereas you can pass the address of a static const. Which one to use really depends on the circumstances; each is at its best in different situations. Please don't assume that one is always better than the other... :-)
If that were the case, Dennis Ritchie would have kept only the best one... hahaha... :-)
In C, #define is much more popular. You can use such values for declaring array sizes, for example:
#define MAXLEN 5
void foo(void) {
    int bar[MAXLEN];
}
ANSI C doesn't allow you to use static consts in this context as far as I know. In C++ you should avoid macros in these cases. You can write
const int maxlen = 5;
void foo() {
    int bar[maxlen];
}
and even leave out static because internal linkage is implied by const already [in C++ only].
Another drawback of const in C is that you can't use the value in initializing another const.
static int const NUMBER_OF_FINGERS_PER_HAND = 5;
static int const NUMBER_OF_HANDS = 2;
// initializer element is not constant, this does not work.
static int const NUMBER_OF_FINGERS = NUMBER_OF_FINGERS_PER_HAND
* NUMBER_OF_HANDS;
Even this does not work with a const since the compiler does not see it as a constant:
static uint8_t const ARRAY_SIZE = 16;
static int8_t const lookup_table[ARRAY_SIZE] = {
1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16}; // ARRAY_SIZE not a constant!
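An enum sidesteps both problems, because enumerator values are integer constant expressions; a sketch reusing the examples above (assumes <stdint.h> for int8_t):
enum {
    NUMBER_OF_FINGERS_PER_HAND = 5,
    NUMBER_OF_HANDS = 2,
    NUMBER_OF_FINGERS = NUMBER_OF_FINGERS_PER_HAND * NUMBER_OF_HANDS,  /* OK now */
    ARRAY_SIZE = 16
};

static int8_t const lookup_table[ARRAY_SIZE] = {  /* OK: ARRAY_SIZE is now a constant expression */
    1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16};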
I'd be happy to use typed const in these cases, otherwise...
If you can get away with it, static const has a lot of advantages. It obeys the normal scope principles, is visible in a debugger, and generally obeys the rules that variables obey.
However, at least in the original C standard, it isn't actually a constant. If you use #define var 5, you can write int foo[var]; as a declaration, but you can't do that (except as a compiler extension) with static const int var = 5;. This is not the case in C++, where the static const version can be used anywhere the #define version can, and I believe this is also the case with C99.
However, never name a #define constant with a lowercase name. It will override any possible use of that name until the end of the translation unit. Macro constants should be in what is effectively their own namespace, which is traditionally all capital letters, perhaps with a prefix.
#define var 5 will cause you trouble if you have things like mystruct.var.
For example,
struct mystruct {
    int var;
};

#define var 5

int main() {
    struct mystruct foo;
    foo.var = 1;
    return 0;
}
The preprocessor will replace it and the code won't compile. For this reason, traditional coding style suggests that all constant #defines use capital letters to avoid such conflicts.
It is ALWAYS preferable to use const instead of #define. That's because const is handled by the compiler, while #define is handled by the preprocessor; roughly speaking, the #define itself is never part of the code the compiler sees.
Example:
#define PI 3.1416
The symbolic name PI may never be seen by compilers; it may be removed by the preprocessor before the source code even gets to a compiler. As a result, the name PI may not get entered into the symbol table. This can be confusing if you get an error during compilation involving the use of the constant, because the error message may refer to 3.1416, not PI. If PI were defined in a header file you didn’t write, you’d have no idea where that 3.1416 came from.
This problem can also crop up in a symbolic debugger, because, again, the name you’re programming with may not be in the symbol table.
Solution:
const double PI = 3.1416; //or static const...
I wrote quick test program to demonstrate one difference:
#include <stdio.h>
enum {ENUM_DEFINED=16};
enum {ENUM_DEFINED=32};
#define DEFINED_DEFINED 16
#define DEFINED_DEFINED 32
int main(int argc, char *argv[]) {
    printf("%d, %d\n", DEFINED_DEFINED, ENUM_DEFINED);
    return 0;
}
Compiling this produces the following error and warning:
main.c:6:7: error: redefinition of enumerator 'ENUM_DEFINED'
enum {ENUM_DEFINED=32};
^
main.c:5:7: note: previous definition is here
enum {ENUM_DEFINED=16};
^
main.c:9:9: warning: 'DEFINED_DEFINED' macro redefined [-Wmacro-redefined]
#define DEFINED_DEFINED 32
^
main.c:8:9: note: previous definition is here
#define DEFINED_DEFINED 16
^
Note that the enum redefinition is an error, while the macro redefinition is merely a warning.
The definition
const int const_value = 5;
does not always define a constant value. Some compilers (for example tcc 0.9.26) just allocate memory identified with the name "const_value". Using the identifier "const_value" you can not modify this memory. But you still could modify the memory using another identifier:
const int const_value = 5;
int *mutable_value = (int*) &const_value;
*mutable_value = 3;
printf("%i", const_value); // The output may be 5 or 3, depending on the compiler.
This means the definition
#define CONST_VALUE 5
is effectively the only way to define a constant value that cannot be modified by any means (an enum constant is equally unmodifiable). Note, though, that modifying a const object through a cast, as above, is undefined behavior in any case.
Although the question was about integers, it's worth noting that #define and enums are useless if you need a constant structure or string. These are both usually passed to functions as pointers. (With strings it's required; with structures it's much more efficient.)
As for integers, if you're in an embedded environment with very limited memory, you might need to worry about where the constant is stored and how accesses to it are compiled. The compiler might add two consts at run time, but add two #defines at compile time. A #define constant may be converted into one or more MOV [immediate] instructions, which means the constant is effectively stored in program memory. A const constant will be stored in the .const section in data memory. In systems with a Harvard architecture, there could be differences in performance and memory usage, although they'd likely be small. They might matter for hard-core optimization of inner loops.
Don't think there's an answer for "which is always best" but, as Matthieu said
static const
is type safe. My biggest pet peeve with #define, though, is when debugging in Visual Studio you cannot watch the variable. It gives an error that the symbol cannot be found.
Incidentally, an alternative to #define, which provides proper scoping but behaves like a "real" constant, is "enum". For example:
enum {number_ten = 10};
In many cases, it's useful to define enumerated types and create variables of those types; if that is done, debuggers may be able to display variables according to their enumeration name.
One important caveat with doing that, however: in C++, enumerated types have limited compatibility with integers. For example, arithmetic on them yields plain int, and the result cannot be assigned back to an enum variable without a cast. I find that to be a curious default behavior for enums; while it would have been nice to have a "strict enum" type, given the desire to have C++ generally compatible with C, I would think the default behavior of an "enum" type should be interchangeable with integers.
A simple difference:
At preprocessing time, the constant is replaced with its value.
So you cannot apply the address-of operator to a define, but you can apply the address-of operator to a variable.
As you would suppose, a define can also be marginally faster than a static const.
For example, having:
#define mymax 100
you can not do printf("address of constant is %p",&mymax);.
But having
const int mymax_var = 100;
you can do printf("address of constant is %p",&mymax_var);.
To be clearer: the define is replaced by its value at the preprocessing stage, so no variable is stored in the program; we have just the value, embedded in the code wherever the define was used.
However, for a static const we have a variable that is allocated somewhere. With gcc, static const variables are allocated in the (read-only) text segment of the program.
We looked at the produced assembler code on the MBF16X... Both variants result in the same code for arithmetic operations (ADD Immediate, for example).
So const int is preferred for the type check while #define is old style. Maybe it is compiler-specific. So check your produced assembler code.
I am not sure if I am right, but in my opinion using a #defined value can be faster than using a normally declared variable (or const value).
When a program is running and needs to use a normally declared variable, it must jump to the exact place in memory where that variable is stored in order to fetch it.
By contrast, when it uses a #defined value, the program doesn't need to access any allocated memory; it just uses the value directly. If you write #define myValue 7 and the program uses myValue, it behaves exactly the same as if it just used 7.
After reading through some of K&R's The C Programming Language I came across the #define symbolic constants. I decided to define...
#define INTEGER_EXAMPLE 2
#define CHAR_EXAMPLE 2
...so my question is how does C know if I'm defining an int or a char type?
#define-d names have no types. They just define textual replacements.
What the compiler is seeing is the preprocessed form. If using GCC, try gcc -C -E somesource.c and have a look at the (preprocessed) output.
In the 1980s the preprocessor was a separate program.
Read about the cpp preprocessor, and the Preprocessor and C preprocessor Wikipedia pages.
You could even define ill-defined names like
#define BAD #*?$ some crap $?
And even more scary you can define things which are syntactically incomplete like
#define BADTASTE 2 +
and later code BADTASTE 3
Actually, you want to use parentheses when defining macros. If you have
#define BADPROD(x,y) x*y
then BADPROD(2+3,4+5) is expanded to 2+3*4+5, which the compiler parses as 2 + (3*4) + 5; you really want
#define BETTERPROD(x,y) ((x)*(y))
So that BETTERPROD(2+3,4+5) is expanded to ((2+3)*(4+5))
Avoid side-effects in macro arguments, e.g. BETTERPROD(j++,j--)
In general, use macros with care and have them stay simple.
Regarding these defines: it doesn't; expanded macros don't have a type. The preprocessor which processes the #define is just replacing text within the source code.
When you use these defines somewhere, e.g.
int i = INTEGER_EXAMPLE;
This will expand to
int i = 2;
Here the literal 2 (which in this context is an int) is assigned to an int.
You could also do:
char c = INTEGER_EXAMPLE;
Here too, the literal 2 is an int, and it is assigned to a char. 2 is within the limits of a char though, so all is ok.
You could even do:
int INTEGER_EXAMPLE = 2;
This would expand to
int 2 = 2;
Which isn't valid C.
#define STRING VALUE
is just an instruction for the preprocessor to replace STRING with VALUE.
Afterwards, the compiler takes control and checks the types.
It doesn't; this is the preprocessor at work. The type of the constant is dependent on the context in which it is used. For instance:
#define INT_EXAMPLE 257
char foo = INT_EXAMPLE;
will attempt to assign 257 in a char context which should generate a warning unless char has more than 8 bits on your computer.
#Defines are nothing but literal replacements of values. You might want to use
static const
As it respects scope and is type-safe. Try this:
#define main no_main
int main() // gets replaced as no_main by the preprocessor
{
    return 0;
}
That should give you linker errors. Or you could try to fool your teacher with this:
#define I_Have_No_Main_Function main //--> Put this in header file 1.h
#include"1.h"
int I_Have_No_Main_Function()
{
return 0;
}
It doesn't. The #define statements are processed before the compiler starts its work. Basically the preprocessor does a search-and-replace on what you wrote: for instance, all instances of INTEGER_EXAMPLE are replaced with the token 2.
It is up to the compiler to decide the type of that 2 based on where it's used:
int x = INTEGER_EXAMPLE; // 2 is an integer
char y = INTEGER_EXAMPLE; // 2 is a char
The preprocessor cannot know the type of the macro definition; it will just replace every occurrence of CHAR_EXAMPLE with 2. I would use a cast:
#define CHAR_EXAMPLE ((char)2)
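With the cast in place, sizeof CHAR_EXAMPLE is sizeof(char) rather than sizeof(int), which can matter in contexts that are sensitive to the operand's type.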
I am confused about when to use macros or enums. Both can be used as constants, but what is the difference between them and what is the advantage of either one? Is it somehow related to compiler level or not?
In terms of readability, enumerations make better constants than macros, because related values are grouped together. In addition, enum defines a new type, so the readers of your program would have easier time figuring out what can be passed to the corresponding parameter.
Compare
#define UNKNOWN 0
#define SUNDAY 1
#define MONDAY 2
#define TUESDAY 3
...
#define SATURDAY 7
to
typedef enum {
UNKNOWN,
SUNDAY,
MONDAY,
TUESDAY,
...
SATURDAY,
} Weekday;
It is much easier to read code like this
void calendar_set_weekday(Weekday wd);
than this
void calendar_set_weekday(int wd);
because you know which constants it is OK to pass.
A macro is a preprocessor thing, and the compiled code has no idea about the identifiers you create. They have been already replaced by the preprocessor before the code hits the compiler. An enum is a compile time entity, and the compiled code retains full information about the symbol, which is available in the debugger (and other tools).
Prefer enums (when you can).
In C, it is best to use enums for actual enumerations: when some variable can hold one of multiple values which can be given names. One advantage of enums is that the compiler can perform some checks beyond what the language requires, like that a switch statement on the enum type is not missing one of the cases. The enum identifiers also propagate into the debugging information. In a debugger, you can see the identifier name as the value of an enum variable, rather than just the numeric value.
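For example, a sketch of that switch check (the warning shown is gcc/clang's -Wswitch, which -Wall enables; the names are mine):
enum color { RED, GREEN, BLUE };

const char *color_name(enum color c)
{
    switch (c) {               /* gcc/clang -Wswitch: 'BLUE' not handled in switch */
    case RED:   return "red";
    case GREEN: return "green";
    }
    return "unknown";
}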
Enumerations can be used just for the side effect of creating symbolic constants of integral type. For instance:
enum { buffer_size = 4096 }; /* we don't care about the type */
this practice is not that widespread. For one thing, buffer_size will be used as an integer and not as an enumerated type. A debugger will not render 4096 into buffer_size, because that value won't be represented as the enumerated type. If you declare some char array[buffer_size]; then sizeof array will not show up as buffer_size. In this situation, the enumeration constant disappears at compile time, so it might as well be a macro. And there are disadvantages, like not being able to control its exact type. (There might be some small advantage in some situation where the output of the preprocessing stages of translation is being captured as text. A macro will have turned into 4096, whereas buffer_size will stay as buffer_size).
A preprocessor symbol lets us do this:
#define buffer_size 4096L /* buffer_size is a long int */
Note that various values from C's <limits.h>, like UINT_MAX, are preprocessor symbols and not enum symbols, with good reason: those identifiers need to have a precisely determined type. Another advantage of a preprocessor symbol is that we can test for its presence, or even make decisions based on its value:
#if ULONG_MAX > UINT_MAX
/* unsigned long is wider than unsigned int */
#endif
Of course we can test enumerated constants also, but not in such a way that we can change global declarations based on the result.
Enumerations are also ill-suited for bitmasks:
enum modem_control { mc_dsr = 0x1, mc_dtr = 0x2, mc_rts = 0x4, ... };
it just doesn't make sense because when the values are combined with a bitwise OR, they produce a value which is outside of the type. Such code causes a headache, too, if it is ever ported to C++, which has (somewhat more) type-safe enumerations.
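For bitmasks, suffixed macros (the names here are my own) keep everything in plain unsigned arithmetic:
#define MC_DSR 0x1u
#define MC_DTR 0x2u
#define MC_RTS 0x4u

unsigned int mc_lines = MC_DSR | MC_DTR;   /* ordinary unsigned arithmetic; no enum type to fall outside of */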
Note there are some differences between macros and enums, and either of these properties may make them (un)suitable as a particular constant.
enums are signed (compatible with int). In any context where an unsigned type is required (think especially bitwise operations!), enums are out.
if long long is wider than int, big constants won't fit in an enum (see the sketch after this list).
The size of an enum is (usually) sizeof(int). For arrays of small values (up to say, CHAR_MAX) you might want a char foo[] rather than an enum foo[] array.
enums are integral numbers. You can't have enum funny_number { PI=3.14, E=2.71 }.
enums are a C89 feature; K&R compilers (admittedly ancient) don't understand them.
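A sketch of the big-constant point above (assuming a 32-bit int; the names are mine):
#define TEN_BILLION 10000000000LL           /* fine: the LL suffix gives it type long long */
/* enum { TEN_BILLION_E = 10000000000 }; */ /* error: an enum constant must fit in an int */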
If a macro is implemented properly (i.e. it does not suffer from associativity issues when substituted), then there's not much difference in applicability between macro and enum constants in situations where both are applicable, i.e. in situations where you need signed integer constants specifically.
However, in general case macros provide much more flexible functionality. Enums impose a specific type onto your constants: they will have type int (or, possibly, larger signed integer type), and they will always be signed. With macros you can use constant syntax, suffixes and/or explicit type conversions to produce a constant of any type.
Enums work best when you have a group of tightly associated sequential integer constants. They work especially well when you don't care about the actual values of the constants at all, i.e. when you only care about them having some well-behaved unique values. In all other cases macros are a better choice (or basically the only choice).
As a practical matter, there is little difference. They are equally usable as constants in your programs. Some may prefer one or the other for stylistic reasons, but I can't think of any technical reason to prefer one over the other.
One difference is that macros allow you to control the integral type of related constants. But an enum will use an int.
#define X 100L
enum { Y = 100L };
printf("%ld\n", X);
printf("%d\n", Y); /* Y has int type */
enum has an advantage: block scope:
{ enum { E = 12 }; }
{ enum { E = 13 }; }
With macros there is a need to #undef.
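That is, getting the effect of the two blocks above with macros would need something like:
#define E 12
/* ... code using E ... */
#undef E
#define E 13
/* ... code using the new E ... */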
I was just curious to know if it is possible to have a pointer referring to #define constant. If yes, how to do it?
The #define directive is a directive to the preprocessor, meaning that it is invoked by the preprocessor before anything is even compiled.
Therefore, if you type:
#define NUMBER 100
And then later you type:
int x = NUMBER;
What your compiler actually sees is simply:
int x = 100;
It's basically as if you had opened up your source code in a word processor and did a find/replace to replace each occurrence of "NUMBER" with "100". So your compiler has no idea about the existence of NUMBER. Only the pre-compilation preprocessor knows what NUMBER means.
So, if you try to take the address of NUMBER, the compiler will think you are trying to take the address of an integer literal constant, which is not valid.
No, because #define is for text replacement, so it's not a variable you can get a pointer to -- what you're seeing is actually replaced by the definition of the #define before the code is passed to the compiler, so there's nothing to take the address of. If you need the address of a constant, define a const variable instead (C++).
It's generally considered good practice to use constants instead of macros, because of the fact that they actually represent variables, with their own scoping rules and data types. Macros are global and typeless, and in a large program can easily confuse the reader (since the reader isn't seeing what's actually there).
#define defines a macro. A macro just causes one sequence of tokens to be replaced by a different sequence of tokens. Pointers and macros are totally distinct things.
If by "#define constant" you mean a macro that expands to a numeric value, the answer is still no, because anywhere the macro is used it is just replaced with that value. There's no way to get a pointer, for example, to the number 42.
No, it's not possible in C/C++.
You can use the #define directive to give a meaningful name to a constant in your program.
It can be used in two forms: the #define directive can contain an object-like definition or a function-like definition.
Please see this link:
http://msdn.microsoft.com/en-us/library/teas0593%28VS.80%29.aspx
(I'm sorry, I am unable to provide one more link; please also see IBM's documentation on the #define directive, where you can get the full details.)
There is a way to overcome this issue:
#define ROW 2
void foo()
{
    int tmpInt = ROW;
    int *rowPointer = &tmpInt;
    // ...
}
Or if you know its type you can even do this:
void getDefinePointer(int *pointer)
{
    *pointer = ROW;
}
And use it:
int rowValue = 0;
getDefinePointer(&rowValue);
printf("ROW==%d\n", rowValue);
and you have a pointer to a copy of the #define constant's value.
I know this is really basic, but it's got me stumped...
In Objective-C I'm trying to write:
const int BUF_SIZE = 3;
static char buffer[BUF_SIZE+1];
But I get "storage size of buffer isn't constant". How do I make Xcode realise that I'm setting it to a constant, + 1...? Or is this not possible...?
Thanks...!
Joel
I think it's a C thing: if I recall correctly, C requires a static array's size to be an integer constant expression, and a const variable doesn't qualify. I'd just use a #define constant as a workaround.
You can use an enum:
enum
{
    BUF_SIZE = 3
};
Or a macro
#define BUF_SIZE 3
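Either way, the original declaration then compiles, because the array size is now an integer constant expression:
enum { BUF_SIZE = 3 };
static char buffer[BUF_SIZE + 1];   /* OK: BUF_SIZE + 1 is a constant expression */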
Happens in gcc with things like:
#define LPBUFFER_LGTH ((int) (2*MS25))
as well.
Workaround as above: hardcode the constant you want. I think the problem is with a 'define of a define', i.e. two levels of definition.