Difference between #define and enum{} in C [duplicate]

What's the difference between using a define statement and an enum statement in C/C++ (and is there any difference when using them with either C or C++)?
For example, when should one use
enum {BUFFER = 1234};
over
#define BUFFER 1234

enum defines a syntactical element.
#define is a preprocessor directive, executed before the compiler sees the code, and therefore is not a language element of C itself.
Generally enums are preferred as they are type-safe and more easily discoverable. Defines are harder to locate and can have complex behavior; for example, one piece of code can redefine a #define made by another, which can be hard to track down.
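A minimal sketch of that redefinition hazard (the file names and values here are invented):
/* config.h */
#define BUFFER 1234

/* other_header.h, included later */
#undef BUFFER
#define BUFFER 512  /* every use of BUFFER after this point silently means 512 */
An enumeration constant cannot be silently changed out from under you this way.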

#define statements are handled by the pre-processor before the compiler gets to see the code so it's basically a text substitution (it's actually a little more intelligent with the use of parameters and such).
Enumerations are part of the C language itself and have the following advantages.
1/ They have a type, and the compiler can type-check them.
2/ Since they are available to the compiler, symbol information on them can be passed through to the debugger, making debugging easier.

Enums are generally preferred over #define wherever it makes sense to use an enum:
Debuggers can show you the symbolic name of an enum's value ("openType: OpenExisting" rather than "openType: 2").
You get a bit more protection from name clashes, but this isn't as bad as it was (most compilers warn about redefinition).
The biggest difference is that you can use enums as types:
// Yeah, dumb example
enum OpenType {
    OpenExisting,
    OpenOrCreate,
    Truncate
};

void OpenFile(const char* filename, OpenType openType, int bufferSize);
This gives you type-checking of parameters (you can't mix up openType and bufferSize as easily), and makes it easy to find what values are valid, making your interfaces much easier to use. Some IDEs can even give you intellisense code completion!

Define is a preprocessor command; it's just like doing "replace all" in your editor: it replaces one string with another, and then the result is compiled.
Enum is a special case of type. For example, if you write:
enum ERROR_TYPES
{
    REGULAR_ERR = 1,
    OK = 0
};
there exists a new type called ERROR_TYPES.
It is true that REGULAR_ERR evaluates to 1, but converting an int to this type without an explicit cast can produce a warning (if you configure your compiler for high verbosity).
Summary:
they look alike, but with enum you benefit from type checking, while with defines you simply replace strings in your code.

It's always better to use an enum if possible. Using an enum gives the compiler more information about your source code, a preprocessor define is never seen by the compiler and thus carries less information.
When implementing a set of modes, for example, an enum makes it possible for the compiler to catch missing case statements in a switch.

enum can group multiple elements in one category:
enum fruits { apple = 1234, orange = 12345 };
while #define can only create unrelated constants:
#define apple 1234
#define orange 12345

#define is a preprocessor command, enum is in the C or C++ language.
It is always better to use enums over #define for these kinds of cases. One reason is type safety. Another is that when you have a sequence of values, you only have to give the beginning of the sequence in the enum; the other values get consecutive values automatically.
enum {
    ONE = 1,
    TWO,
    THREE,
    FOUR
};
instead of
#define ONE 1
#define TWO 2
#define THREE 3
#define FOUR 4
As a side note, there are still some cases where you may have to use #define (typically for some kind of macro, if you need to be able to construct an identifier that contains the constant), but that's macro black magic, and very, very rarely the way to go. If you go to those extremes you should probably use a C++ template (but if you're stuck with C...).
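For the record, a small sketch of that macro black magic (all names invented): token pasting with ## can build an identifier out of a constant, which no enum can do. A two-level macro is needed so that WIDTH expands before pasting:
#define CONCAT_(a, b) a##b
#define CONCAT(a, b) CONCAT_(a, b)
#define WIDTH 32

void handler_32(void);

/* CONCAT(handler_, WIDTH) expands to handler_32, so this calls handler_32() */
#define CALL_WIDTH_HANDLER() CONCAT(handler_, WIDTH)()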

If you only want this single constant (say, for a buffer size) then I would not use an enum, but a define. I would use enums for things like return values (that mean different error conditions) and wherever we need to distinguish different "types" or "cases". In that case we can use an enum to create a new type we can use in function prototypes etc., and then the compiler can sanity-check that code better.

Besides everything already written, one point was mentioned but not shown, and it is interesting. E.g.:
enum action { DO_JUMP, DO_TURNL, DO_TURNR, DO_STOP };
//...
void do_action( enum action anAction, info_t x );
Considering action as a type makes things clearer. Using defines, you would have written:
void do_action(int anAction, info_t x);

For integral constant values I've come to prefer enum over #define. There seem to be no disadvantages to using enum (discounting the minuscule disadvantage of a bit more typing), and you have the advantage that enum can be scoped, while #define identifiers have global scope that tramples everything.
Using #define isn't usually a problem, but since there are no drawbacks to enum, I go with that.
In C++ I also generally prefer enum to const int, even though in C++ a const int can be used in place of a literal integer value (unlike in C), because enum is portable to C (which I still work in a lot).
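A minimal sketch of that scoping difference (identifiers invented):
#define GLOBAL_LIMIT 10  /* visible everywhere after this line */

void demo(void)
{
    enum { LOCAL_LIMIT = 5 };  /* visible only inside demo() */
    int buf[LOCAL_LIMIT + GLOBAL_LIMIT];
    (void)buf;
}
/* LOCAL_LIMIT is out of scope here; GLOBAL_LIMIT still applies everywhere */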

If you have a group of constants (like "Days of the Week"), enums would be preferable, because it shows that they are grouped; and, as Jason said, they are type-safe. If it's a global constant (like a version number), that's more what you'd use a #define for; although this is the subject of a lot of debate.

In addition to the good points listed above, you can limit the scope of enums to a class, struct or namespace. Personally, I like to have the minimum number of relevant symbols in scope at any one time, which is another reason for using enums rather than #defines.

Another advantage of an enum over a list of defines is that compilers (gcc at least) can generate a warning when not all values are checked in a switch statement. For example:
enum state {
    STATE_ONE,
    STATE_TWO,
    STATE_THREE
};
...
enum state state;  /* the warning only fires when the switch operand has the enum type */
...
switch (state) {
case STATE_ONE:
    handle_state_one();
    break;
case STATE_TWO:
    handle_state_two();
    break;
}
In the previous code, the compiler is able to generate a warning that not all values of the enum are handled in the switch (here, STATE_THREE is missing). If the states were done as #defines, this would not be the case.

enums are more used for enumerating some kind of set, like days in a week. If you need just one constant number, const int (or double, etc.) would definitely be better than enum. I personally do not like #define (at least not for the definition of constants) because it does not give me type safety, but you can of course use it if it suits you better.

Creating an enum creates not only literals but also the type that groups these literals: this adds semantics to your code that the compiler is able to check.
Moreover, when using a debugger, you have access to the values of enum literals. This is not always the case with #define.

While several answers above recommend using enum for various reasons, I'd like to point out that using defines has an actual advantage when developing interfaces: you can introduce new options and let software use them conditionally.
For example:
#define OPT_X1 1 /* introduced in version 1 */
#define OPT_X2 2 /* introduced in version 2 */
Then software which can be compiled against either version can do:
#ifdef OPT_X2
int flags = OPT_X2;
#else
int flags = 0;
#endif
With an enumeration, this isn't possible without a run-time feature-detection mechanism.

Enum:
1. Generally used for multiple related values.
2. An enumerator has two parts, a name and a value; the names must be distinct, but values may repeat. If you don't specify values, the first enumerator is 0, the second is 1, and so on, unless values are given explicitly.
3. Enums have a type, and the compiler can type-check them.
4. They make debugging easier.
5. Their scope can be limited to a class (in C++).
Define:
1. Used when you have to define only one value.
2. It simply replaces one string with another.
3. Its scope is global; we cannot limit its scope.
Overall, we should prefer enum.

There is little difference. The C Standard says that enumerations have integral type and that enumeration constants are of type int, so both may be freely intermixed with other integral types, without errors. (If, on the other hand, such intermixing were disallowed without explicit casts, judicious use of enumerations could catch certain programming errors.)
Some advantages of enumerations are that the numeric values are automatically assigned, that a debugger may be able to display the symbolic values when enumeration variables are examined, and that they obey block scope. (A compiler may also generate nonfatal warnings when enumerations are indiscriminately mixed, since doing so can still be considered bad style even though it is not strictly illegal.) A disadvantage is that the programmer has little control over those nonfatal warnings; some programmers also resent not having control over the sizes of enumeration variables.
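For instance, a sketch of the kind of indiscriminate mixing that can draw such a nonfatal warning (types invented; gcc and clang call the relevant option -Wenum-conversion, though availability varies by version):
enum color { RED, GREEN };
enum fruit { APPLE, ORANGE };

enum color c = APPLE;  /* legal in C, but bad style; may draw a warning */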

Related

What is the use of enum types and variables created from the particular enum type

I was reading about enums on Stack Overflow but couldn't find a satisfactory answer.
I just want to know why we create an enum type like
enum result {pass,fail}; //result - user defined type
I could have just written
enum{pass,fail};
The only use of an enum type I can see is to make variables of that particular type, like
enum result test;
So is the enum type used only for this purpose (to create variables of that type)? Or is there some better practical use for variables of an enum type?
Sorry if this seems like a duplicate question, but I did some research and wasn't satisfied by the answers I read, as I am a beginner in C.
The use of an enumeration constant (enum) has many advantages over using the traditional symbolic constant
style of #define. These advantages include a lower maintenance requirement, improved program readability,
and better debugging capability. The first advantage is that enumerated constants are generated automatically
by the compiler. Conversely, symbolic constants must be manually assigned values by the programmer.
For instance, if you had an enumerated constant type for error codes that could occur in your program, your
enum definition could look something like this:
enum Error_Code
{
    OUT_OF_MEMORY,
    INSUFFICIENT_DISK_SPACE,
    LOGIC_ERROR,
    FILE_NOT_FOUND
};
In the preceding example, OUT_OF_MEMORY is automatically assigned the value of 0 (zero) by the compiler
because it appears first in the definition. The compiler then continues to automatically assign numbers to
the enumerated constants, making INSUFFICIENT_DISK_SPACE equal to 1, LOGIC_ERROR equal to 2, and so on.
If you were to approach the same example by using symbolic constants, your code would look something
like this:
#define OUT_OF_MEMORY 0
#define INSUFFICIENT_DISK_SPACE 1
#define LOGIC_ERROR 2
#define FILE_NOT_FOUND 3
Each of the two methods arrives at the same result: four constants assigned numeric values to represent error
codes. Consider the maintenance required, however, if you were to add two constants to represent the error
codes DRIVE_NOT_READY and CORRUPT_FILE. Using the enumeration constant method, you would simply put these two constants anywhere in the enum definition. The compiler would generate two unique values for these constants. Using the symbolic constant method, you would have to manually assign two new numbers to these constants. Additionally, you would want to ensure that the numbers you assign to these constants are unique. Because you don't have to worry about the actual values, defining your constants using the enumerated method is easier than using the symbolic constant method. The enumerated method also helps prevent accidentally reusing the same number for different constants.
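For instance, the extended definition would simply be (the values in the comments are what the compiler assigns automatically):
enum Error_Code
{
    OUT_OF_MEMORY,           /* 0 */
    INSUFFICIENT_DISK_SPACE, /* 1 */
    LOGIC_ERROR,             /* 2 */
    FILE_NOT_FOUND,          /* 3 */
    DRIVE_NOT_READY,         /* 4, generated automatically */
    CORRUPT_FILE             /* 5, generated automatically */
};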
Another advantage of using the enumeration constant method is that your programs are more readable and
thus can be understood better by others who might have to update your program later. For instance, consider
the following piece of code:
void copy_file(char* source_file_name, char* dest_file_name)
{
    ...
    enum Error_Code err;  /* in C the enum keyword is required here (no typedef) */
    if (drive_ready() != TRUE)
        err = DRIVE_NOT_READY;
    ...
}
Looking at this example, you can derive from the definition of the variable err that err should be assigned
only numbers of the enumerated type Error_Code. Hence, if another programmer were to modify or add
functionality to this program, the programmer would know from the definition of Error_Code what
constants are valid for assigning to err.
Conversely, if the same example were to be applied using the symbolic constant method, the code would look
like this:
void copy_file(char* source_file, char* dest_file)
{
    ...
    int err;
    ...
    if (drive_ready() != TRUE)
        err = DRIVE_NOT_READY;
    ...
}
Looking at the preceding example, a programmer modifying or adding functionality to the copy_file()
function would not immediately know what values are valid for assigning to the err variable. The
programmer would need to search for the #define DRIVE_NOT_READY statement and hope that all relevant
constants are defined in the same header file. This could make maintenance more difficult than it needs to be and make your programs harder to understand.
NOTE
Simply defining your variable to be of an enumerated type does not ensure that only valid values
of that enumerated type will be assigned to that variable. In the preceding example, the compiler
will not require that only values found in the enumerated type Error_Code be assigned to err; it
is up to the programmer to ensure that only valid values found in the Error_Code type definition
are used.
A third advantage to using enumeration constants is that some symbolic debuggers can print the value of an
enumeration constant. Conversely, most symbolic debuggers cannot print the value of a symbolic constant.
This can be an enormous help in debugging your program, because if your program is stopped at a line that
uses an enum, you can simply inspect that constant and instantly know its value. On the other hand, because
most debuggers cannot print #define values, you would most likely have to search for that value by manually
looking it up in a header file.
With enum result {pass, fail};, you can create a function with result as a parameter type:
void foo(enum result r);
As well as helping source code readability, it can also assist enormously with debugging: a good debugger will display the enumeration name for r, despite the fact that you can pass any value permissible by the enum's backing type to the function.
enum types allow you to give meaningful names to int values. In your example, you have pass and fail as the values of the enum result type. This allows you to do something like
enum result r = pass;
Alternatively, if you use numbers 0 and 1 instead, you would have to do
int r = 1;
Note that when you read the enum version, the meaning is immediately obvious. When you read the int version, you may ask "Why the value 1?" and "What does 1 stand for?".
In short, enum makes code easier for humans to read.

Advantages in using an enum to define a single value? (C)

Recently, in this question I saw an enum used to define a single value, e.g.:
enum { BITS_PER_WORD = 32 };
Instead of:
#define BITS_PER_WORD 32
Assuming more members won't be added later, what, if any, are the advantages of doing this? (Or is this more a question of personal taste?)
Said differently, if I have existing code using one-off int defines, is there any good reason to change these around for one-off enums shown above?
Out of curiosity I compared GCC's optimized assembler output for some non-trivial code and the result was unchanged between enums/defines.
Enumeration constants have several advantages:
They're scoped and don't expand in contexts where they shouldn't (as pointed out by mafso).
Most debuggers are aware of them and can use them in expressions written in the debugger.
Macros have several different advantages:
They can be use in preprocessor conditionals (#if BITS_PER_WORD == 32 won't work if BITS_PER_WORD is an enumeration constant).
They can have arbitrary types (also covered in mafso's answer).
They can be removed (#undef) when no longer needed.
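A short sketch of the preprocessor-conditional point (identifiers invented): in an #if, any identifier the preprocessor does not know as a macro is replaced by 0, so a test against an enumeration constant silently fails instead of producing an error.
#define WORD_MACRO 32
enum { WORD_ENUM = 32 };

#if WORD_MACRO == 32  /* true: the macro is visible to the preprocessor */
/* compiled in */
#endif

#if WORD_ENUM == 32   /* WORD_ENUM is not a macro, so this reads 0 == 32 */
/* never compiled in */
#endif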
An advantage is that enums are scoped, and you can define an enum inside a block. A macro, on the other hand, would also expand where it shouldn't, e.g. in:
foo.BITS_PER_WORD;
or even
void foo(int BITS_PER_WORD) { /* ... */ }
An advantage of a macro is that you can give it a non-int value.

Is using enums safe in all compilers?

In a discussion, a colleague told me that he never uses enum because he experienced that some C-compilers don't cope with the enum statement correctly.
He couldn't remember which compiler exactly had problems but among the problems, there were errors when doing something like
enum my_enum {
    my_enum_first = 5,
    my_enum_second = 10
};
i.e. initializing enum values instead of letting the compiler do the automatic assignment. Another one was that the compiler decides for itself how big the enum is and therefore you could have unpredictable behavior for sizeof my_enum when compiling your code under various platforms.
To get around that, he told me it was better to use #defines to define the constant elements. But especially for doxygen it's quite handy to have an enum (e.g. as a function parameter), because in the generated documentation you can simply click on my_enum and jump directly to its description.
Another example would be code completion, where your IDE tells you what you can specify as valid parameters for functions. I know that – as long as you're compiling the code as C code – there's no type safety (i.e. I could also pass 5 instead of my_enum_first), so the use of an enum seems to be mostly cosmetic.
The question is: do you know any compilers that have limitations regarding the usage of enum?
Edit 1:
Regarding the environment: we are developing for various embedded platforms, so there could also be a compiler for some obscure micro-controller...
Edit 2:
He was able to tell me that the KEIL C51 compiler didn't play well with enums. Are there any experiences with current versions of the C51 compiler?
Compilers are free to choose the size of an enum based on its range of possible values. This only really becomes an issue if you're exposing enums in your API, and users of your code may be using a different compiler or build options.
In this case, confusion can be caused by the calling code passing in a 16-bit value, for example, and the receiving code expecting it to be 32 bits. If the top 16 bits of the passed-in value are left uninitialized, then bad things will happen.
You can work around this kind of issue by including a dummy entry in your enum to enforce a minimum size.
For example:
typedef enum {
    FirstValue = 12,
    SecondValue = 25,
    DummyValue = 65536  // force the enum to be larger than 16 bits
} MyEnum;
I'm pretty sure that a compiler that doesn't play nice with enum is a broken compiler: enum is specified in the standard, so a failure to implement it means the compiler technically shouldn't be used to compile C. (For the record, the scope of enumeration types is discussed in 6.2.1 of C99, and enumerations are defined as types in 6.2.5, so one can assume they're a valid part of the standard from there on.)
So no, I don't know of any such compilers.

#define or enum? [duplicate]

Possible Duplicate:
Why use enum when #define is just as efficient?
When programming in C, is it better practice to use #define statements or enums for states in a state machine?
Technically it doesn't matter. The compiler will most likely even create identical machine code for either case, but an enumeration has three advantages:
Using the right compiler+debugger combination, the debugger will print enumeration variables by their enumeration name and not by their number. So "StateBlahBlup" reads much nicer than "41", doesn't it?
You don't have to explicitly give every state a number; the compiler does the numbering for you if you let it. Assume you already have 20 states and want to add a new state in the middle: with defines, you have to do all the renumbering on your own; with an enumeration, you can just add the state and the compiler renumbers all states below the new one for you (see the sketch after this list).
You can tell the compiler to warn you if a switch statement does not handle all the possible enum values, e.g. because you forgot to handle some values or because the enum was extended but you forgot to update the switch statements that handle enum values. (It will not warn if there's a default case, though, as all values not handled explicitly end up in the default case.)
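A sketch of the second point (state names invented): inserting STATE_PAUSED in the middle renumbers everything after it, with no manual work:
enum state {
    STATE_IDLE,    /* 0 */
    STATE_RUNNING, /* 1 */
    STATE_PAUSED,  /* 2, newly inserted */
    STATE_STOPPED  /* 3; was 2 before the insertion */
};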
Since the states are related elements, I think it is better to have an enum defining them.
There's no definitive answer. enum offers you scoping and automatic value assignment, but does not give any control over the constant's type (it is always signed int). #define ignores scoping, but allows better control of the type: it lets you choose the constant's type, either by using suffixes or by including an explicit cast in the definition.
So, choose for yourself what is more important to you. For a state machine, enum might be a better choice, unless you have a good reason to control the type.
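For illustration, a sketch of what controlling the type looks like with macros (names invented); none of these can be expressed as an enum constant, which is always an int:
#include <stdint.h>

#define TIMEOUT_MS 500UL       /* unsigned long, via a suffix */
#define SCALE ((double)1.5)    /* double, via an explicit cast */
#define PORT ((uint16_t)8080)  /* fixed-width type, via a cast */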
I prefer enum. They are more compact and 'safer'. You can also imply order in an enum, which might be helpful in a state machine. #defines should be avoided where possible, since the preprocessor will blindly replace every occurrence in the source, which can lead to unintended effects that are difficult to debug.
If enum is supported by your compiler, then that would be preferred. Failing that, by all means, use #define. All C++ compilers and modern C compilers should support enum, but older compilers (particularly ones targeting embedded platforms) may not support enum.
If you must use #define, make sure to define your constants with parentheses, to avoid surprises when the macro expands inside an expression:
#define RED_STATE (1)
#define YELLOW_STATE (2)
#define GREEN_STATE (3)
#define directives can have lots of unintended consequences and don't follow common scoping rules. Use enums when you have related data.
More information: http://www.embedded.com/columns/programmingpointers/9900402?_requestid=341945 [C++ material, but still marginally relevant]
You can use this trick to make the compiler check the type of a #define value:
#define VALUE_NAME ((TYPE_NAME) 12)
However, the real problem with #define is that it can be redefined in application code. (Of course, the compiler will warn you about it.)
enum is great when you have exclusive options, but it is less convenient for bit-field flags, like this:
#define SQ_DEFAULT 0x0
#define SQ_WITH_RED 0x1
#define SQ_WITH_BLUE 0x2
void paint_square(int flags);
Then you can paint red-blue square with:
paint_square(SQ_WITH_RED | SQ_WITH_BLUE);
...which a plain enum doesn't express as directly (OR-ing two enumerators yields an int, not a value of the enum type).
You can use whichever you want and like.
Still, as everyone is saying, I would also like to add my vote for enums.
Enums should always be preferred when you are dealing with related data, as in the case of a state machine; you can also define order in enums, which helps in implementing the state machine.
Furthermore, enums keep your program safer, since all enum values are of their own type, which avoids possible confusion too.
#define should not be used in the case of a state machine or related data. Anyway, that's my suggestion, but there is no hard and fast rule.
I would also like to add one more point: enums add readability and understandability to your code, for the future or for anyone else who reads it. This is an important point when you have a very large program with a lot of #defines beyond the ones you are using for your state machine.

Shall I prefer constants over defines?

In C, shall I prefer constants over defines? I've been reading a lot of code lately, and all of the examples make heavy use of defines.
No, in general you should not use const-qualified objects in C to create named constants. In order to create a named constant in C you should use either macros (#define) or enums. In fact, the C language has no constants in the sense that you seem to imply. (C is significantly different from C++ in this regard.)
In the C language the notions of constant and constant expression are defined very differently than in C++. In C, constant means a literal value, like 123. Here are some examples of constants in C:
123
34.58
'x'
Constants in C can be used to build constant expressions. However, since const-qualified objects of any type are not constants in C, they cannot be used in constant expressions, and consequently you cannot use const-qualified objects where constant expressions are required.
For example, the following is not a constant
const int C = 123; /* C is not a constant!!! */
and since the above C is not a constant, it cannot be used to declare an array type in file scope
typedef int TArray[C]; /* ERROR: constant expression required */
It cannot be used as a case label
switch (i) {
case C: ; /* ERROR: constant expression required */
}
It cannot be used as bit-field width
struct S {
    int f : C; /* ERROR: constant expression required */
};
It cannot be used as an initializer for an object with static storage duration
static int i = C; /* ERROR: constant expression required */
It cannot be used as a enum initializer
enum {
    E = C /* ERROR: constant expression required */
};
i.e. it cannot be used anywhere a constant expression is required.
This might seem counter-intuitive, but this is how the C language is defined.
This is why you see these numerous #define-s in the code you are working with. Again, in the C language const-qualified objects have very limited use. They are basically completely useless as "constants", which is why in C you are basically forced to use #define or enums to declare true constants.
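To make the contrast concrete, here is the enum version of the earlier example (a sketch reusing the same names); unlike the const int, it works in all the contexts listed above:
enum { C = 123 }; /* C is now a genuine constant expression */

typedef int TArray[C]; /* OK */
static int i = C;      /* OK */

int f(int x)
{
    switch (x) {
    case C: return 1;  /* OK */
    default: return 0;
    }
}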
Of course, in situations when a const-qualified object works for you, i.e. it does what you want it to do, it is indeed superior to macros in many ways, since it is scoped and typed. You should probably prefer such objects where applicable, however in general case you'll have to take into account the above limitations.
Constants should be preferred over defines. There are several advantages:
Type safety. While C is a weakly typed language, using a define loses all of the type safety that would otherwise allow the compiler to pick up problems for you.
Ease of debugging. You can change the value of constants through the debugger, while defines are automatically changed in the code by the pre-processor to the actual value, meaning that if you want to change the value for test/debugging purposes, you need to re-compile.
Maybe I have been using them wrong but, at least in gcc, you can't use constants in case statements.
const int A = 12;

switch (argc) {
case A: /* error: case labels require a constant expression */
    break;
}
Though this question is specific to C, I guess it is good to know this:
#include <stdio.h>

int main() {
    const int CON = 123;
    int* A = &CON;        /* C gives only a warning here; C++ rejects it */
    (*A)++;               /* modifying a const object is undefined behavior */
    printf("%d\n", CON);  /* typically prints 124 in C */
    return 0;
}
works in C, but not in C++
One of the reasons to use #define is to avoid such things messing up your code, especially when it is a mix of C and C++.
A lot of people here are giving you "C++ style" advice. Some even say the C++ arguments apply to C. That may be a fair point. (Whether it is or not feels kind of subjective.) The people who say const sometimes means something different in the two languages are also correct.
But these are mostly minor points and personally, I think in truth there is relatively minor consequence to going either way. It's a matter of style, and I think different groups of people will give you different answers.
In terms of common usage, historical usage, and most common style, in C, it's a lot more typical to see #define. Using C++isms in C code can come off as strange to a certain narrow segment of C coders. (Including me, so that's where my biases lie.)
But I'm surprised no one has suggested a middle-ground solution that "feels right" in both languages: if it fits into a group of integer constants, use an enum.
define can be used for many purposes (very loosely) and should be avoided if you can substitute it with const, which defines a variable you can do a lot more with.
define has to be used in cases like these:
preprocessor conditionals (#if / #ifdef)
substitution into your source lines
code macros
An example where you have to use define rather than const is when you have a version number, say 3, and you want version 4 to include some methods that are not available in version 3:
#define VERSION 4
...
#if VERSION == 4
...
#endif
Defines have been part of the language longer than constants, so a lot of older code will use them, because defines were the only way to get the job done when the code was written. For more recent code it may be simply a matter of programmer habit.
Constants have a type as well as a value, so they would be preferred when it makes sense for your value to have a type, but not when it is typeless (or polymorphic).
If it's something that isn't determined programmatically, I use #define. For example, if I want all of my UI objects to have the same space between them, I might use #define kGUISpace 20.
Apart from the excellent reasons given by AndreyT for using DEFINES rather than constants in "C" code, there is another, more pragmatic reason for using DEFINES.
DEFINES are easy to define and use from (.h) header files, which is where any experienced C coder would expect to find constants defined. Defining consts in header files is not quite so easy; it's more code to avoid duplicate definitions, etc.
Also, the "typesafe" arguments are mostly moot: most compilers will pick up glaring errors such as assigning a string to an int, or will "do the right thing" on a slight mismatch such as assigning an integer to a float.
Macros (defines) can be used by the pre-processor and at compile time; constants cannot.
You can do compile-time checks to make sure a macro is within a valid range (and #error if it isn't). You can use default values for a macro if it hasn't already been defined. You can use a macro in the size of an array.
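A sketch of those compile-time checks (macro name and bounds invented):
#ifndef BUFFER_SIZE
#define BUFFER_SIZE 1024  /* default when none was defined on the command line */
#endif

#if BUFFER_SIZE < 16 || BUFFER_SIZE > 65536
#error "BUFFER_SIZE out of range"
#endif

char buffer[BUFFER_SIZE]; /* the macro also works as an array size */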
A compiler can optimize with macros better than it can with constants:
const int SIZE_A = 15;
#define SIZE_B 15
for (i = 0; i < SIZE_A + 1; ++i); // if not optimized may load A and add 1 on each pass
for (i = 0; i < SIZE_B + 1; ++i); // compiler will replace "SIZE_B + 1" with 16
Most of my work is with embedded processors that don't have amazing optimizing compilers. Maybe gcc will treat SIZE_A like a macro at some optimization level.
