In C, the standard does not define whether enums are signed or unsigned. However, when I try to compare an enum value to the lowest (i.e. 0) enumeration constant, I get the warning "pointless comparison of unsigned integer with zero." (The compiler is IAR Embedded Workbench.)
typedef enum
{
    BAR,
    BAZ
} Foo;

// later...
Foo x = (Foo)some_integral_value;

if (x >= BAR) // <- this gives me the warning
    // stuff
I need to check the range of the enum, however, because it is being converted from an integral type. Is there a good way to do this that avoids the warning, which will still work if the compiler decides to change the underlying type?
In C this is a false problem:
- all enumeration constants are always of type int, anyhow;
- conversions to and from the enumeration type work through implicit conversion; no explicit conversions (AKA casts) are necessary, nor desirable (see the sketch below).
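For illustration, a minimal sketch of that implicit conversion (the function name is made up):

enum Foo { BAR, BAZ };

void sketch(void) {
    enum Foo x = 1; /* int -> enum Foo: implicit, no cast */
    int n = x;      /* enum Foo -> int: implicit, no cast */
    (void)n;
}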
Now to your example
enum Foo
{
    BAR,
    BAZ
};

// later...
Foo x = (Foo)some_integral_value;
This doesn't even compile: in C, Foo on its own doesn't name anything. You must use enum Foo, or provide an appropriate typedef, something like
typedef enum Foo Foo;
Perhaps you compile C code with a C++ compiler? In any case, provide a complete example that shows your problem.
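For completeness, a sketch of a version that does compile as C, combining the typedef with the implicit conversions described above (the value is made up):

typedef enum Foo { BAR, BAZ } Foo;

int main(void) {
    int some_integral_value = 1;
    Foo x = some_integral_value; /* implicit conversion, no cast needed */
    return x == BAZ ? 0 : 1;
}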
It sounds like in C all enumeration constants have type int, but I'm leaving these suggestions here, since your compiler is either non-standard C or is compiling the code as C++.
If you have control over the definitions of these enums, you could just make them start at 1:
enum Foo
{
    BAR = 1,
    BAZ
};
You could also add a single fake negative value to force it to be signed:
enum Foo
{
    NEGATIVE_PLACEHOLDER = -1,
    BAR,
    BAZ,
};
In C++11, you can give your enum an explicit underlying type:
enum Foo : int
{
    BAR,
    BAZ
};
See this page, specifically the section that says:
enum name : type { enumerator = constexpr , enumerator = constexpr , ... }
...
2) declares an unscoped enumeration type whose underlying type is fixed
It sounds like the type should always be predictable though:
Values of unscoped enumeration type are implicitly-convertible to integral types. If the underlying type is not fixed, the value is convertible to the first type from the following list able to hold their entire value range: int, unsigned int, long, unsigned long, long long, or unsigned long long. If the underlying type is fixed, the values can be converted to their promoted underlying type.
So, if your enum fits in an int, it should always use an int.
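If the underlying type may nonetheless end up unsigned (as with the IAR compiler in the question), one common workaround, a sketch not taken from the answers above with a made-up helper name, is to do the range check on a plain int before converting to the enum type:

typedef enum { BAR, BAZ } Foo;

/* The comparison is done on int, never on the (possibly unsigned)
   enum type, so the "pointless comparison" warning cannot fire. */
static int is_valid_foo(int v) {
    return v >= BAR && v <= BAZ;
}

/* later...
   if (is_valid_foo(some_integral_value)) {
       Foo x = (Foo)some_integral_value;
       // stuff
   }
*/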
I have an enum defined like this:
typedef enum
{
    emp1 = 0u,
    emp2,
    emp3
} employid;
C gives me warnings for the following operations.
Problem 1:
unsigned int var; // 32 bit in my compiler
typedef struct
{
    employid e;
} mystruct;
mystruct s;
s.e = var; // **getting warning enumerated type mixed with another type**
Problem 2:
somefun(var); // **getting warning enumerated type mixed with another type**
The function definition is somefun(employid e);
I don't understand: since my enum values are all positive (the first element is 0u), why is the C compiler shouting at me for mixing it with an unsigned int?
Your code is fine as far as C language rules go. These are just extra diagnostics telling you that the code is fishy.
An unsigned int may hold values not matching any valid enumeration constant, in which case you will end up with s.e holding an invalid value. That is: most of the time, it doesn't really make any sense to mix plain integers with enums. If you find yourself doing so, the root problem is likely on the program design level.
In case you are sure that var holds an ok value, you could do an explicit cast s.e = (employid)var;. But more likely, var should have been declared as employid to begin with.
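A sketch of both suggestions, using the names from the question (the helper functions are made up):

typedef enum
{
    emp1 = 0u,
    emp2,
    emp3
} employid;

typedef struct
{
    employid e;
} mystruct;

void fix1(mystruct *s, unsigned int var) {
    s->e = (employid)var; /* explicit cast: only if var is known to be valid */
}

void fix2(mystruct *s) {
    employid var = emp2;  /* better: declare it as employid to begin with */
    s->e = var;           /* no mixed-type warning */
}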
I am using gcc. I compiled this code expecting it to throw an error, but it compiled and ran successfully.
#include <stdio.h>

enum DIRECTION {EAST, WEST, NORTH, SOUTH};

int main(void) {
    enum DIRECTION currentDirection = 10;
    printf("%d\n", currentDirection);
    return 0;
}
OUTPUT:
10
An enum's type is defined in the C99 draft standard, section 6.7.2.2 Enumeration specifiers, as:
Each enumerated type shall be compatible with char, a signed integer type, or an
unsigned integer type. The choice of type is implementation-defined,110) [...]
where footnote 110 says:
An implementation may delay the choice of which integer type until all enumeration constants have been seen.
The standard does not say that you are not allowed to specify a value outside those listed in the declaration of the enum, although Annex I (Common warnings) does suggest such a warning; it is not required:
A value is given to an object of an enumerated type other than by assignment of an
enumeration constant that is a member of that type, or an enumeration object that has
the same type, or the value of a function that returns the same enumerated type (6.7.2.2).
gcc will not produce a warning, although clang with the -Wassign-enum flag or the -Weverything flag will, and it looks similar to this:
warning: integer constant not in range of enumerated type 'enum DIRECTION' [-Wassign-enum]
and you can use -Werror to make it an error.
Keith makes two interesting observations:
Using -Werror would make clang non-conforming since the code is valid C.
enum DIRECTION currentDirection = 128; has implementation defined behavior since the type could well be char.
In C an enum constant is the equivalent of an int. You can use them interchangeably.
An enum (enumeration) is a type that can hold a set of integer values specified by the user. It's a way of creating symbolic names for a small list of related values. Their purpose is to make programs clearer and more readable.
The values of an enum type are called enumerators. The enumerators are in the same scope as the enum and their values implicitly convert to integers.
A macro is meant for the preprocessor, and the compiled code has no idea about the macros you create. They have been already replaced by the preprocessor before the code hits the compiler. An enum is a compile time entity, and the compiled code retains full information about the symbol, which is available in the debugger (and other tools).
Also it's convenient to add a new symbolic name later and let the values reorder themselves. However enums in C are not strongly typed and are compatible with signed integers. So you can assign any value to an enum type variable.
// APPLE == 0, PEARS == 1, ...
enum fruits { APPLE, PEARS, BANANA };

// add MANGO later and the values reorder themselves:
// APPLE == 0, MANGO == 1, PEARS == 2, ...
enum fruits { APPLE, MANGO, PEARS, BANANA };

// a separate example: C enums are not strongly typed
enum color { RED, PEACH };
enum color my_color = MANGO; // not strongly typed
enum fruits my_fruit = 7;    // int -> enum fruits conversion
However, an enum class in C++ is strongly typed:
enum class Traffic_light {red, yellow, green};
enum class Warning {green, yellow, orange, red};
Warning w = 1; // error. no int -> Warning implicit conversion
Traffic_light t = Warning::red; // type error. Warning::red is a different type
Does using typedef enum { VALUE_1 = 0x00, ... } typeName; have any more overhead in C (specifically, compiling using AVR-GCC for an AVR MCU) than doing typedef unsigned char typeName; and then just defining each value with #define VALUE_1 0x00?
My specific application is status codes that can be returned and checked by functions. It seems neater to me to use the typedef enum style, but I wanted to be sure that it wasn't going to add any significant overhead to the compiled application.
I would assume no, but I wasn't really sure. I tried to look for similar questions, but most of them pertained to C++ and had answers specific to C++.
An enum declaration creates an enumerated type. Such a type is compatible with (and therefore has the same size and representation as) some predefined integer type, but the compiler chooses which one.
But the enumeration constants are always of type int. (This differs from C++, where the constants are of the enumerated type.)
So typedef unsigned char ... vs. typedef enum ... will likely change the size and representation of the type, which can matter if you define objects of the type or functions that return the type, but the constants VALUE_1 et al will be of type int either way.
It's probably best to use the enum type; that way the compiler can decide what representation is best. Your alternative of specifying unsigned char will minimize storage, but depending on the platform it might actually slow down access to objects relative to, say, using something compatible with int.
Incidentally, the typedef isn't strictly necessary. If you prefer, you can use a tag:
enum typeName { Value_1 = 0x00, ... };
But then you have to refer to the type as enum typeName rather than just typeName. The advantage of typedef is that it lets you give the type a name that's just a single identifier.
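For illustration, a sketch comparing the two approaches (the names are made up; the exact enum size is whatever your compiler picks):

#include <stdio.h>

typedef unsigned char byteType;                     /* always 1 byte */
typedef enum { VALUE_1 = 0x00, VALUE_2 } enumType;  /* size is implementation-defined */

int main(void) {
    printf("sizeof(byteType) = %zu\n", sizeof(byteType)); /* 1 */
    printf("sizeof(enumType) = %zu\n", sizeof(enumType)); /* often sizeof(int) */
    printf("sizeof(VALUE_1)  = %zu\n", sizeof(VALUE_1));  /* sizeof(int): the constant is an int */
    return 0;
}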
See the simple example below. When a function returning one enum is assigned to a variable of a different enum, I don't get any warning, even with gcc -Wall -pedantic. Why is it not possible for a C compiler to do type checking on enums? Or is it gcc-specific? I don't have access to any other compiler right now to try it out.
enum fruit {
    APPLE,
    ORANGE
};

enum color {
    RED,
    GREEN
};

static inline enum color get_color(void) {
    return RED;
}

int main(void) {
    enum fruit ftype;
    ftype = get_color();
}
This declaration:
enum fruit {
    apple,
    orange
};
declares three things: a type called enum fruit, and two enumerators called apple and orange.
enum fruit is actually a distinct type. It's compatible with some implementation-defined integer type; for example, enum fruit might be compatible with int, with char, or even with unsigned long long if the implementation chooses, as long as the chosen type can represent all the values.
The enumerators, on the other hand, are constants of type int. In fact, there's a common trick of using a bare enum declaration to declare int constants without using the preprocessor:
enum { MAX = 1000 };
Yes, that means that the constant apple, even though it was declared as part of the definition of enum fruit, isn't actually of type enum fruit. The reasons for this are historical. And yes, it would probably have made more sense for the enumerators to be constants of the type.
In practice, this inconsistency rarely matters much. In most contexts, discrete types (i.e., integer and enumeration types) are largely interchangeable, and the implicit conversions usually do the right thing.
enum fruit { apple, orange };

enum fruit obj;      /* obj is of type enum fruit */
obj = orange;        /* orange is of type int; it's implicitly
                        converted to enum fruit */
if (obj == orange) { /* operands are converted to a common type */
    /* ... */
}
But the result is that, as you've seen, the compiler isn't likely to warn you if you use a constant associated with one enumerated type when you mean to use a different one.
One way to get strong type-checking is to wrap your data in a struct. (Struct and enum tags share a single name space in C, so the wrappers need tags of their own.)

enum fruit { /* ... */ };
enum color { /* ... */ };

struct fruit_wrap { enum fruit f; };
struct color_wrap { enum color c; };

struct fruit_wrap and struct color_wrap are distinct and incompatible types with no implicit (or explicit) conversion between them. The drawback is that you have to refer to the .f or .c member explicitly. (Most C programmers just count on their ability to get things right in the first place -- with mixed results.)
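For illustration, a sketch of how the wrappers catch the mistake from the question (the names are made up):

enum fruit { APPLE, ORANGE };
enum color { RED, GREEN };

struct fruit_wrap { enum fruit f; };
struct color_wrap { enum color c; };

struct color_wrap get_color(void) {
    struct color_wrap col = { RED };
    return col;
}

int main(void) {
    struct fruit_wrap ftype;
    /* ftype = get_color();  -- error: incompatible types */
    ftype.f = APPLE;         /* members must be named explicitly */
    return 0;
}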
(typedef doesn't give you strong type checking; despite the name, it creates an alias for an existing type, not a new type.)
(The rules in C++ are a little different.)
Probably most of us understand the underlying causes ("the spec says it must work"), but we also agree that this is a cause of a lot of programming errors in "C" land and that the struct wrapping workaround is gross. Ignoring add-on checkers such as lint, here's what we have:
gcc (4.9): No warning available.
microsoft cl (18.0): No warning available.
clang (3.5): YES -Wenum-conversion
gcc decided not to warn, but icc (the Intel compiler) does warn in this situation. If you want some additional type checking for enum types, you can pass your code to a static code checker such as Lint, which is able to warn in such cases.
gcc decided it was not useful to warn for implicit conversions between enum types. Note also that C doesn't require the implementation to issue a diagnostic for an assignment between two different enum types; this is the same as for an assignment between any arithmetic types, where no diagnostic is required by C. For example, gcc also would not warn if you assigned a long long to a char or a short to a long.
That's because enums in C are simply a group of named integer constants that save you from having to #define a whole bunch of constants. It's not like C++, where the enums you create are of a specific type. That's just how C is.
It's also worth noting that the actual size used to represent enum values depends on the compiler.
10 years after this question was asked, GCC can do it now:
gcc -Wextra main.c
main.c: In function ‘main’:
main.c:17:11: warning: implicit conversion from ‘enum color’ to ‘enum fruit’ [-Wenum-conversion]
17 | ftype = get_color();
An enum in C is basically handled like an integer. It's just a nicer way to use constants.
// this would work as well
ftype = 1;
You can also specify the values:
enum color {
    RED = 0, GREEN, BLUE
} mycolor;
mycolor = 1; // GREEN
The gcc guys always have a reason not to do something.
Use clang with options -Wenum-conversion -Wassign-enum.
Is sizeof(enum) == sizeof(int), always? Or is it compiler-dependent?

Is it wrong to say that, since compilers are optimized for word length (memory alignment), int is the word size on a particular compiler? Does that mean there is no processing penalty if I use enums, as they would be word-aligned?

Is it not better if I put all the return codes in an enum, as I clearly do not care about the values they get, only the names, while checking the return types? If that is the case, won't #define be better, as it would save memory?
What is the usual practice?
If I have to transport these return types over a network and some processing has to be done at the other end, what would you prefer: enums, #defines, or const ints?

EDIT: Just checking on the net: since compilers don't keep symbolic information for macros, how do people debug then? Compare the integer value with the header file?

From the answers (I am adding this line below, as I need clarification):
"So it is implementation-defined, and
sizeof(enum) might be equal to
sizeof(char), i.e. 1."
Does it not mean that the compiler checks the range of values in the enum and then assigns memory? I don't think so, but of course I don't know. Can someone please explain to me what "might be" means?
It is compiler-dependent and may differ between enums. The following are the semantics:
enum X { A, B };
// A has type int
assert(sizeof(A) == sizeof(int));
// some integer type. Maybe even int. This is
// implementation defined.
assert(sizeof(enum X) == sizeof(some_integer_type));
Note that "some integer type" in C99 may also include extended integer types (which the implementation, however, has to document, if it provides them). The type of the enumeration is some type that can store the value of any enumerator (A and B in this case).
I don't think there are any penalties in using enumerations. Enumerators are integral constant expressions too (so you may use them to initialize static or file-scope variables, for example), and I prefer them to macros whenever possible.
Enumerators don't need any runtime memory; only when you create a variable of the enumeration type do you use runtime memory. Just think of enumerators as compile-time constants.
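For illustration (a sketch; the names are made up), enumerators can appear anywhere an integer constant expression is required:

enum { BUF_SIZE = 128 };          /* needs no storage of its own */

static char buffer[BUF_SIZE];     /* array size: a constant expression */
static int limit = BUF_SIZE * 2;  /* file-scope initializer */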
I would just use a type that can store the enumerator values (I should know the rough range of values beforehand), cast to it, and send it over the network. Preferably the type should be some fixed-width one, like int32_t, so it doesn't come to conflicts when different machines are involved. Or I would print the number and scan it on the other side, which gets rid of some of these problems.
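A minimal sketch of that approach, assuming a POSIX system for htonl/ntohl (the enum and helper names are made up):

#include <stdint.h>
#include <string.h>
#include <arpa/inet.h>

enum status { STATUS_OK = 0, STATUS_RETRY = 1, STATUS_FAIL = 2 };

/* Serialize: cast to a fixed-width type and fix the byte order. */
void encode_status(enum status s, unsigned char buf[4]) {
    uint32_t wire = htonl((uint32_t)s);
    memcpy(buf, &wire, 4);
}

/* Deserialize: convert back and validate the range before use. */
int decode_status(const unsigned char buf[4], enum status *out) {
    uint32_t wire;
    memcpy(&wire, buf, 4);
    wire = ntohl(wire);
    if (wire > STATUS_FAIL)
        return -1;            /* not a known status value */
    *out = (enum status)wire;
    return 0;
}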
Response to Edit
Well, the compiler is not required to use any particular size. An easy thing to see is that the sign of the values matters: unsigned types can give a significant performance boost in some calculations. The following is the behavior of GCC 4.4.0 on my box:
int main(void) {
    enum X { A = 0 };
    enum X a; // X compatible with "unsigned int"
    unsigned int *p = &a;
}
But if you assign a -1, then GCC chooses to use int as the type that X is compatible with:
int main(void) {
    enum X { A = -1 };
    enum X a; // X compatible with "int"
    int *p = &a;
}
GCC's option --short-enums makes it use the smallest type that still fits all the values:
int main(void) {
    enum X { A = 0 };
    enum X a; // X compatible with "unsigned char"
    unsigned char *p = &a;
}
In recent versions of GCC, the flag is spelled -fshort-enums. On some targets, the default type is unsigned int.
C99, 6.7.2.2p4 says
Each enumerated type shall be
compatible with char, a signed
integer type, or an unsigned
integer type. The choice of type
is implementation-defined,108) but
shall be capable of representing the
values of all the members of the
enumeration. [...]
Footnote 108 adds
An implementation may delay the choice of which integer
type until all enumeration constants have been seen.
So it is implementation-defined, and sizeof(enum) might be equal to sizeof(char), i.e. 1.
In choosing the size for some small range of integers, there is always a penalty: if you make it small in memory, there is probably a processing penalty; if you make it larger, there is a space penalty. It's a time-space tradeoff.
Error codes are typically #defines, because they need to be extensible: different libraries may add new error codes. You cannot do that with enums.
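A sketch of that extensibility argument (the header and macro names are made up):

/* liberror.h -- codes defined by the base library */
#define ERR_OK 0
#define ERR_IO 1

/* app.h -- an application or add-on library can extend the set later
   without touching liberror.h; a single closed enum definition would
   have to be edited in place instead */
#define ERR_CONFIG 100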
Is the sizeof(enum) == sizeof(int), always?
The ANSI C standard says:
Each enumerated type shall be compatible with char, a signed integer type, or an unsigned integer type. The choice of type is implementation-defined. (6.7.2.2 Enumeration specifiers)
So I would take that to mean no.
If this is the case, won't #define be better, as it would save memory?
In what way would using defines save memory over using an enum? An enum is just a type that allows you to provide more information to the compiler. In the actual resulting executable, it's just turned into an integer, just as the preprocessor converts a macro created with #define into its value.
What is the usual practice if I have to transport these return types over a network and some processing has to be done at the other end?
If you plan to transport values over a network and process them on the other end, you should define a protocol. Decide on the size in bits of each type and the endianness (the order of the bytes), and make sure you adhere to that in both the client and the server code. Also, don't just assume that because it happens to work, you've got it right. It may just be, for example, that the endianness on your chosen client and server platforms matches, but that might not always be the case.
No.
Example: The CodeSourcery compiler
When you define an enum like this:
enum MyEnum1 {
    A = 1,
    B = 2,
    C = 3
};
// MyEnum1 will have sizeof 1 (the values fit in a char)

enum MyEnum2 {
    A = 1,
    B = 2,
    C = 3,
    D = 400
};
// MyEnum2 will have sizeof 2 (400 doesn't fit in a char)
Details from their mailing list
On some compilers, the size of an enum depends on how many entries the enum has (fewer than 255 entries => byte, more than 255 entries => int). But this depends on the compiler and the compiler settings.
enum fruits { apple, orange, strawberry, grapefruit };

char fruit = apple;
fruit = orange;
if (fruit < strawberry)
    ...
all of this works perfectly
If you want a specific underlying type for an enum instance, just don't use the enum type itself for the variable.