In many programs a #define serves the same purpose as a constant. For example:
#define FIELD_WIDTH 10
const int fieldWidth = 10;
I commonly see the first form preferred over the other, relying on the pre-processor to handle what is basically an application decision. Is there a reason for this tradition?
There is a very solid reason for this: const in C does not mean something is constant. It just means a variable is read-only.
In places where the compiler requires a true constant expression (such as the size of a non-VLA array), using a const variable such as fieldWidth is simply not possible.
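A minimal sketch of that limitation, reusing the names from the question (file-scope declarations):
#define FIELD_WIDTH 10
const int fieldWidth = 10;

char line_a[FIELD_WIDTH];        /* fine: the macro expands to the literal 10 */
/* char line_b[fieldWidth]; */   /* error at file scope: fieldWidth is not a constant expression */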
They're different.
const is just a qualifier which says that a variable cannot be changed at runtime. All other properties of the variable persist: it has allocated storage, and that storage may be addressed. So code does not treat it as a literal; it refers to the variable by accessing its memory location and loading its value at runtime (except for a static const, which can often be optimized away). And because a const variable has allocated storage, if you put its definition in a header and include it in several C source files, you'll get a "multiple symbol definition" linker error unless you mark it extern (declaring it in the header and defining it in exactly one source file). In that case the compiler can't optimize code against its actual value (unless link-time/global optimization is on).
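A sketch of the extern pattern that paragraph alludes to (the file names are illustrative):
/* widths.h -- hypothetical header */
extern const int fieldWidth;     /* declaration only: no storage defined here */

/* widths.c -- exactly one definition across the whole program */
const int fieldWidth = 10;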
#define simply substitutes a name with its value. Furthermore, a #define'd constant may be used in the preprocessor: you can use it with #ifdef to do conditional compilation based on its value, or use the stringizing operator # to get a string with its value. And as the compiler knows its value at compile time it may optimize code based on that value.
For example:
#define SCALE 1
...
scaled_x = x * SCALE;
When SCALE is defined as 1 the compiler can eliminate the multiplication as it knows that x * 1 == x, but if SCALE is an (extern) const, it will need to generate code to fetch the value and perform the multiplication because the value will not be known until the linking stage. (extern is needed to use the constant from several source files.)
A closer equivalent to using #define is using enumerations:
enum dummy_enum {
    constant_value = 10010
};
But this is restricted to integer values and doesn't have the other advantages of #define, so it is not widely used.
const is useful when you need to import a constant value from some library where it was compiled in, when it is used through pointers, or when it is an array of constant values accessed through a variable index. Otherwise, const has no advantages over #define.
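A sketch of the last of those cases -- a read-only table indexed at run time (names and values are illustrative):
static const double coeffs[3] = { 0.5, 1.25, 2.0 };   /* read-only lookup table */

double scale(int i, double x)
{
    return coeffs[i] * x;        /* indexed with a runtime value; a single #define could not do this */
}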
The reason is that most of the time, you want a constant, not a const-qualified variable. The two are not remotely the same in the C language. For example, variables are not valid as part of initializers for objects with static storage duration, or as non-VLA array dimensions (for example the size of an array in a structure, or any array prior to C99).
Expanding on R's answer a little bit: fieldWidth is not a constant expression; it's a const-qualified variable. Its value is not established until run-time, so it cannot be used where a compile-time constant expression is required (such as in an array declaration, or a case label in a switch statement, etc.).
Compare with the macro FIELD_WIDTH, which after preprocessing expands to the constant expression 10; this value is known at compile time, so it can be used for array dimensions, case labels, etc.
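A small sketch of that difference using a case label (a hypothetical snippet reusing the names from the question):
#define FIELD_WIDTH 10
const int fieldWidth = 10;

int classify(int width)
{
    switch (width) {
    case FIELD_WIDTH:            /* fine: the macro expands to the constant 10 */
        return 1;
    /* case fieldWidth: */       /* error in C: not a constant expression */
    default:
        return 0;
    }
}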
To add to R.'s and Bart's answer: there is only one way to define symbolic compile-time constants in C: enumeration constants. The standard mandates that these have type int. I personally would write your example as
enum { fieldWidth = 10 };
But I guess tastes differ quite a bit among C programmers on that point.
Although a const int will not always be appropriate, an enum will usually work as a substitute for the #define if you are defining something to be an integral value. This is actually my preference in such a case.
enum { FIELD_WIDTH = 16384 };
char buf[FIELD_WIDTH];
In C++ this is a huge advantage as you can scope your enum in a class or namespace, whereas you cannot scope a #define.
In C you don't have namespaces and cannot scope an enum inside a struct, and I am not even sure you get the type safety, so I cannot actually see any major advantage, although maybe some C programmer will point one out to me.
According to K&R (2nd edition, page 211), the "const and volatile properties are new with the ANSI standard". This implies that really old, pre-ANSI code did not have these keywords at all, so it may really just be a matter of tradition.
Moreover, it says that a compiler should detect attempts to change const variables but, other than that, it may ignore the qualifier. I take this to mean that some compilers may not optimize a const variable into an immediate value in the machine code (the way a #define effectively becomes one), and this might cost an extra memory access and hurt performance.
Some C compilers will also store all const variables in the binary, which, if you are preparing a large list of coefficients, can use up a tremendous amount of space in the embedded world.
Conversely: using const allows flashing over an existing program to alter specific parameters.
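A sketch of that embedded-world trade-off (the table name and values are purely illustrative):
/* Lives in flash/ROM with the program image: it costs space there, but a field
   tool can re-flash just this table to tune the device without rebuilding the code. */
static const int calibration_coeffs[4] = { 12, -3, 7, 19 };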
The best way to define numeric constants in C is using enum. Read the corresponding chapter of K&R's The C Programming Language, page 39.
Related
I am not a developer but I understand some C concepts. However, I'm having a hard time finding where the enums (e.g. NR_LRU_LISTS, etc.) in meminfo.c/meminfo_proc_show() and the variables (e.g. totalram_pages, etc.) in page_alloc.c/si_meminfo() are set.
What I mean by "set" is, for example, NR_LRU_LISTS = 324077. What I understood there is that LRU_ACTIVE_FILE equals 3, but there's no = operator in front of NR_LRU_LISTS, so it must be set somewhere else.
I've clicked on the enums/variables to see where they may be used, but the references are either too numerous and irrelevant, or non-defining.
The last possibility would be that I'm not aware of something, but what?
To be honest, my goal here is to determine how /proc/meminfo's values are calculated.
But here my question is: where are these enums and variables set?
Update 1:
The enums part is now solved, and NR_LRU_LISTS equals 5.
But the totalram_pages part seems to be harder to find out...
The constants you are asking about are defined using C's "enum" feature.
enum Foo { A = 4, B, C };
declares constants named A, B, and C with values 4, 5, 6 respectively.
Each constant with no initializer is set to one more than the previous constant. If the first constant in an enum declaration has no initializer it is set to zero.
The variables you are asking about are defined with no initializer, at file scope (that is, outside of any function). For instance, totalram_pages is defined on line 128 of page_alloc.c, with a public declaration for use throughout the kernel on line 50 of linux/mm.h. Because they are defined at file scope and they don't have initializers, they are initialized to zero at program start. (This is a crucial difference from variables defined inside a function with no initializers. Those start off with "indeterminate" values, reading which provokes undefined behavior.)
I do not know how totalram_pages receives a meaningful value. This code is too complicated for me to want to track that down right now.
It sounds like you are just beginning to learn C. Studying other people's code is a good way to learn, but you should start with simple programs. The Linux kernel is not simple, and because it's an operating system kernel, it also does a lot of things that would be considered bad style or just plain wrong in any other program. Don't start with it.
... That said, declaring a bunch of related constants using an enum and letting them take sequential values implicitly is totally normal and good style, and so is defining variables at file scope with no initializer and relying on them to be zero at program start. (It is often wrong to have a global variable in the first place, but if you genuinely need one, relying on implicit initialization to zero is not wrong.) These are things you need to understand and things you are likely to want to do yourself in due course.
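A small sketch of the initialization rule described above (variable names are illustrative, not the kernel's):
#include <stdio.h>

static unsigned long total_pages;        /* file scope, no initializer: starts at 0 */

int main(void)
{
    int scratch;                         /* block scope, no initializer: indeterminate */
    (void)scratch;                       /* silence the unused-variable warning */
    printf("%lu\n", total_pages);        /* prints 0 */
    /* printf("%d\n", scratch); */       /* reading scratch here would be undefined behavior */
    return 0;
}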
I'm quite often confused when coming back to C by the inability to create an array using the following initialisation pattern...
const int SOME_ARRAY_SIZE = 6;
const int myArray[SOME_ARRAY_SIZE];
My understanding of the problem is that the const qualifier does not guarantee true constness but merely asserts that the value of SOME_ARRAY_SIZE will not change at runtime. But why can the compiler not assume that the value is constant at compile time? It says 6 right there in the source code...
I think I'm missing something core in my fundamental understanding of C. Somebody help me out here. :)
[UPDATE] After reading a bit more around C99 and variable length arrays I think I understand this a bit better. What I was trying to create was a variable length array - const does not create a compile-time constant but rather a runtime constant. Therefore I was declaring a variable length array, which is only valid in C99 at function/block scope. A variable length array at file scope is impossible as the compiler cannot assign a fixed memory address to an unbounded array. [/UPDATE]
Well, in C++ the semantics are a bit different. In C++ your code would work fine. You must distinguish between two things: const and constant expression. const simply means, as you described, that the value is read-only. A constant expression, on the other hand, means the value is known at compile time and is a compile-time constant. The semantics of const in C are always of the first kind. In C, constant expressions are essentially limited to literals and things built from them (such as enumeration constants), which is why #define is used for this kind of thing.
In C++ however, any const object initialized with a constant expression is in itself a constant expression.
I don't know exactly WHY this is so in C, it's just the way it is
The problem is that the language syntax demands an integer constant expression between the [ ]. SOME_ARRAY_SIZE is still a variable (even if you told the compiler nobody is allowed to vary it!)
The const keyword is basically a read-only indication. It does not, really, indicate the underlying value will not change, even though that is the case in your example.
When it comes to pointers, this is more clear:
void bar(void);   /* may modify the pointed-to object through a non-const alias */

void foo(int const * p)
{
    if (*p == 100)
    {
        bar();
        /* Here, the compiler cannot assume that *p is still 100 */
    }
}
In this case, a compiler should not accept the code in your example, as it requires the array size to be constant. If it accepted it, the user could later run into trouble when porting the code to a more strict compiler.
You can do this in C99, and some compilers prior to C99 also had support for this as an extension to C89 (e.g. gcc). If you're stuck with an old compiler that doesn't have C99 support (e.g. MSVC), though, then you'll have to do it the old-school way and use a #define for the array size.
Note that the above comments apply only to such declarations at local scope (i.e. automatic variables). C99 still doesn't allow such declarations at global scope.
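A sketch of that distinction, assuming a C99 compiler (the identifiers are illustrative):
#define SOME_ARRAY_SIZE 6
const int runtime_size = 6;

int at_file_scope[SOME_ARRAY_SIZE];      /* fine: the macro is a constant expression */
/* int not_allowed[runtime_size]; */     /* error: no VLAs at file scope, even in C99 */

void f(void)
{
    int local_vla[runtime_size];         /* OK in C99: a variable length array */
    (void)local_vla;
}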
I just did a very quick test with Xcode and an Objective-C file I currently had open on my machine, and put this in the .m file:
const int arrs = 6;
const int arr[arrs];
This compiles without any issues.
What's the difference between using a define statement and an enum statement in C/C++ (and is there any difference when using them with either C or C++)?
For example, when should one use
enum {BUFFER = 1234};
over
#define BUFFER 1234
enum defines a syntactical element.
#define is a preprocessor directive, executed before the compiler sees the code, and therefore it is not a language element of C itself.
Generally enums are preferred as they are type-safe and more easily discoverable. Defines are harder to locate and can have complex behavior, for example one piece of code can redefine a #define made by another. This can be hard to track down.
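A small sketch of the redefinition hazard mentioned above (the header names are hypothetical):
/* config_a.h -- hypothetical header */
#define BUFFER 1234

/* config_b.h -- included later; silently changes every subsequent use of BUFFER */
#undef BUFFER
#define BUFFER 512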
#define statements are handled by the pre-processor before the compiler gets to see the code so it's basically a text substitution (it's actually a little more intelligent with the use of parameters and such).
Enumerations are part of the C language itself and have the following advantages.
1/ They may have type and the compiler can type-check them.
2/ Since they are available to the compiler, symbol information on them can be passed through to the debugger, making debugging easier.
Enums are generally preferred over #define wherever it makes sense to use an enum:
Debuggers can show you the symbolic name of an enum's value ("openType: OpenExisting" rather than "openType: 2").
You get a bit more protection from name clashes, but this isn't as bad as it once was (most compilers warn about redefinition of a macro).
The biggest difference is that you can use enums as types:
// Yeah, dumb example
enum OpenType {
    OpenExisting,
    OpenOrCreate,
    Truncate
};
void OpenFile(const char* filename, OpenType openType, int bufferSize);
This gives you type-checking of parameters (you can't mix up openType and bufferSize as easily), and makes it easy to find what values are valid, making your interfaces much easier to use. Some IDEs can even give you intellisense code completion!
Define is a preprocessor command; it's just like doing "replace all" in your editor. It replaces one string with another and then compiles the result.
Enum is a special case of type, for example, if you write:
enum ERROR_TYPES
{
    REGULAR_ERR = 1,
    OK = 0
};
there exists a new type called ERROR_TYPES.
It is true that REGULAR_ERR evaluates to 1, but converting between this type and plain int may produce a warning (if you configure your compiler to high verbosity).
Summary:
they are both alike, but when using an enum you benefit from type checking, while with defines you simply substitute text.
It's always better to use an enum if possible. Using an enum gives the compiler more information about your source code; a preprocessor define is never seen by the compiler and thus carries less information.
For implementing e.g. a bunch of modes, using an enum makes it possible for the compiler to catch missing case-statements in a switch, for instance.
enum can group multiple elements in one category:
enum fruits { apple = 1234, orange = 12345 };
while #define can only create unrelated constants:
#define apple 1234
#define orange 12345
#define is a preprocessor command, enum is in the C or C++ language.
It is always better to use enums over #define for these kinds of cases. One reason is type safety. Another is that when you have a sequence of values you only have to give the beginning of the sequence in the enum; the other values then take consecutive values.
enum {
    ONE = 1,
    TWO,
    THREE,
    FOUR
};
instead of
#define ONE 1
#define TWO 2
#define THREE 3
#define FOUR 4
As a side-note, there are still some cases where you may have to use #define (typically for some kind of macro, if you need to be able to construct an identifier that contains the constant), but that's kind of macro black magic, and very rarely the way to go. If you go to these extremities you should probably use a C++ template (but if you're stuck with C...).
If you only want this single constant (say for buffersize) then I would not use an enum, but a define. I would use enums for stuff like return values (that mean different error conditions) and wherever we need to distinguish different "types" or "cases". In that case we can use an enum to create a new type we can use in function prototypes etc., and then the compiler can sanity check that code better.
Besides everything already written, one point was mentioned but not shown, and it is interesting. E.g.
enum action { DO_JUMP, DO_TURNL, DO_TURNR, DO_STOP };
//...
void do_action( enum action anAction, info_t x );
Considering action as a type makes things clearer. Using a define, you would have written
void do_action(int anAction, info_t x);
For integral constant values I've come to prefer enum over #define. There seem to be no disadvantages to using enum (discounting the minuscule disadvantage of a bit more typing), but you have the advantage that an enum can be scoped, while #define identifiers have global scope that tramples everything.
Using #define isn't usually a problem, but since there are no drawbacks to enum, I go with that.
In C++ I also generally prefer enum to const int, even though in C++ a const int can be used in place of a literal integer value (unlike in C), because enum is portable to C (which I still work in a lot).
If you have a group of constants (like "Days of the Week") enums would be preferable, because it shows that they are grouped; and, as Jason said, they are type-safe. If it's a global constant (like version number), that's more what you'd use a #define for; although this is the subject of a lot of debate.
In addition to the good points listed above, you can limit the scope of enums to a class, struct or namespace. Personally, I like to have the minimum number of relevant symbols in scope at any one time, which is another reason for using enums rather than #defines.
Another advantage of an enum over a list of defines is that compilers (gcc at least) can generate a warning when not all values are checked in a switch statement. For example:
enum state {
    STATE_ONE,
    STATE_TWO,
    STATE_THREE
};
...
/* assumes: enum state state; -- the warning fires only when the switch operand has the enum type */
switch (state) {
case STATE_ONE:
    handle_state_one();
    break;
case STATE_TWO:
    handle_state_two();
    break;
}
In the previous code, the compiler is able to generate a warning that not all values of the enum are handled in the switch. If the states were done as #define's, this would not be the case.
enums are mostly used for enumerating some kind of set, like the days of the week. If you need just one constant number, a const int (or double, etc.) would definitely be better than an enum. I personally do not like #define (at least not for defining constants) because it does not give me type safety, but you can of course use it if it suits you better.
Creating an enum creates not only literals but also the type that groups these literals: this adds semantics to your code that the compiler is able to check.
Moreover, when using a debugger, you have access to the values of enum literals. This is not always the case with #define.
While several answers above recommend to use enum for various reasons, I'd like to point out that using defines has an actual advantage when developing interfaces. You can introduce new options and you can let software use them conditionally.
For example:
#define OPT_X1 1 /* introduced in version 1 */
#define OPT_X2 2 /* introduced in version 2 */
Then software which can be compiled against either version can do
#ifdef OPT_X2
int flags = OPT_X2;
#else
int flags = 0;
#endif
With an enumeration this isn't possible without a run-time feature detection mechanism.
Enum:
1. Generally used for multiple related values.
2. An enum pairs names with values; the names must be distinct, but the values may repeat. If no value is given, the first enumerator is 0, the second is 1, and so on, unless values are specified explicitly.
3. Enumerators have a type, and the compiler can type-check them.
4. They make debugging easier.
5. Their scope can be limited, e.g. to a class (in C++).
Define:
1. Generally used when only a single value needs to be defined.
2. It simply replaces one piece of text with another.
3. Its scope is global; it cannot be limited.
Overall, prefer enum.
There is little difference. The C Standard says that enumerations have integral type and that enumeration constants are of type int, so both may be freely intermixed with other integral types, without errors. (If, on the other hand, such intermixing were disallowed without explicit casts, judicious use of enumerations could catch certain programming errors.)
Some advantages of enumerations are that the numeric values are automatically assigned, that a debugger may be able to display the symbolic values when enumeration variables are examined, and that they obey block scope. (A compiler may also generate nonfatal warnings when enumerations are indiscriminately mixed, since doing so can still be considered bad style even though it is not strictly illegal.) A disadvantage is that the programmer has little control over those nonfatal warnings; some programmers also resent not having control over the sizes of enumeration variables.
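A small sketch of the block-scope point from that excerpt (the function and names are illustrative):
void fill_buffer(void)
{
    enum { LIMIT = 8 };          /* visible only inside this function */
    char buf[LIMIT];
    buf[0] = '\0';
}
/* LIMIT is not visible here; a #define made inside the function would still be in effect. */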
In C, shall I prefer constants over defines? I've been reading a lot of code lately, and all of the examples make heavy use of defines.
No, in general you should not use const-qualified objects in C to create named constants. In order to create a named constant in C you should use either macros (#define) or enums. In fact, the C language has no constants in the sense that you seem to imply. (C is significantly different from C++ in this regard.)
In the C language the notions of constant and constant expression are defined very differently from C++. In C, constant means a literal value, like 123. Here are some examples of constants in C:
123
34.58
'x'
Constants in C can be used to build constant expressions. However, since const-qualified objects of any type are not constants in C, they cannot be used in constant expressions, and, consequently, you cannot use const-qualified objects where constant expressions are required.
For example, the following is not a constant
const int C = 123; /* C is not a constant!!! */
and since the above C is not a constant, it cannot be used to declare an array type in file scope
typedef int TArray[C]; /* ERROR: constant expression required */
It cannot be used as a case label
switch (i) {
case C: ; /* ERROR: constant expression required */
}
It cannot be used as bit-field width
struct S {
    int f : C; /* ERROR: constant expression required */
};
It cannot be used as an initializer for an object with static storage duration
static int i = C; /* ERROR: constant expression required */
It cannot be used as a enum initializer
enum {
    E = C /* ERROR: constant expression required */
};
i.e. it cannot be used anywhere a constant expression is required.
This might seem counter-intuitive, but this is how the C language is defined.
This is why you see these numerous #defines in the code you are working with. Again, in the C language const-qualified objects have very limited use. They are basically completely useless as "constants", which is why in C you are basically forced to use #define or enums to declare true constants.
Of course, in situations when a const-qualified object works for you, i.e. it does what you want it to do, it is indeed superior to macros in many ways, since it is scoped and typed. You should probably prefer such objects where applicable, but in the general case you'll have to take the above limitations into account.
Constants should be preferred over defines. There are several advantages:
Type safety. While C is a weakly typed language, using a define loses the type safety that would otherwise let the compiler pick up problems for you.
Ease of debugging. You can change the value of constants through the debugger, while defines are automatically changed in the code by the pre-processor to the actual value, meaning that if you want to change the value for test/debugging purposes, you need to re-compile.
Maybe I have been using them wrong but, at least in gcc, you can't use constants in case statements.
const int A = 12;

switch (argc) {
case A:              /* error: the case label is not an integer constant expression */
    break;
}
Though this question is specific to C, I guess it is good to know this:
#include <stdio.h>

int main() {
    const int CON = 123;
    int* A = &CON;       /* constraint violation in C: compilers typically just warn */
    (*A)++;              /* modifying a const-qualified object is undefined behavior */
    printf("%d\n", CON); // 124 in C (typically)
    return 0;
}
works in C, but not in C++
One of the reasons to use #define is to avoid such things messing up your code, especially when it is a mix of C and C++.
A lot of people here are giving you "C++ style" advice. Some even say the C++ arguments apply to C. That may be a fair point. (Whether it is or not feels kind of subjective.) The people who say const sometimes means something different in the two languages are also correct.
But these are mostly minor points and personally, I think in truth there is relatively minor consequence to going either way. It's a matter of style, and I think different groups of people will give you different answers.
In terms of common usage, historical usage, and most common style, in C, it's a lot more typical to see #define. Using C++isms in C code can come off as strange to a certain narrow segment of C coders. (Including me, so that's where my biases lie.)
But I'm surprised no one has suggested a middle ground solution, that "feels right" in both languages: if it fits into a group of integer constants, use an enum.
define can be used for many purposes (it is very loose) and should be avoided if you can substitute it with const, which defines a variable and lets you do a lot more with it.
In cases like the ones below, define has to be used:
conditional compilation directives (#if/#ifdef switches)
substitution into your source lines
code macros
An example where you have to use define over const is when you have a version number, say 3, and you want version 4 to include some methods that are not available in version 3:
#define VERSION 4
...
#if VERSION==4
................
#endif
Defines have been part of the language longer than constants, so a lot of older code will use them because defines were the only way to get the job done when the code was written. For more recent code it may simply be a matter of programmer habit.
Constants have a type as well as a value, so they would be preferred when it makes sense for your value to have a type, but not when it is typeless (or polymorphic).
If it's something that isn't determined programmatically, I use #define. For example, if I want all of my UI objects to have the same space between them, I might use #define kGUISpace 20.
Apart from the excellent reasons given by AndreyT for using DEFINES rather than constants in "C" code there is another more pragmatic reason for using DEFINES.
DEFINES are easy to define and use in (.h) header files, which is where any experienced C coder would expect to find constants defined. Defining consts in header files is not quite so easy -- it's more code to avoid duplicate definitions, etc.
Also the "typesafe" arguments are moot most compilers will pick up glaring errors suchh as assing a string to and int, or, will "do the right thing" on a slight mismatch such as assigning an integer to a float.
Macros (defines) can be used by the pre-processor and at compile time; constants cannot.
You can do compile-time checks to make sure a macro is within a valid range (and #error if it isn't). You can provide default values for a macro if it hasn't already been defined. You can use a macro in the size of an array.
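A sketch of those preprocessor tricks (the macro name and the range check are hypothetical):
#ifndef BUFFER_SIZE
#define BUFFER_SIZE 64                   /* default if the build system didn't define it */
#endif

#if BUFFER_SIZE < 16 || BUFFER_SIZE > 4096
#error "BUFFER_SIZE out of range"
#endif

static char buffer[BUFFER_SIZE];         /* usable as a (non-VLA) array size */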
A compiler can optimize with macros better than it can with constants:
const int SIZE_A = 15;
#define SIZE_B 15
for (i = 0; i < SIZE_A + 1; ++i); // if not optimized may load A and add 1 on each pass
for (i = 0; i < SIZE_B + 1; ++i); // compiler will replace "SIZE_B + 1" with 16
Most of my work is with embedded processors that don't have amazing optimizing compilers. Maybe gcc will treat SIZE_A like a macro at some optimization level.
In my college days I read about the auto keyword and in the course of time I actually forgot what it is. It is defined as:
defines a local variable as having a local lifetime
I never found it being used anywhere; is it really used, and if so, where and in which cases?
If you'd read the IAQ (Infrequently Asked Questions) list, you'd know that auto is useful primarily to define or declare a vehicle:
auto my_car;
A vehicle that's consistently parked outdoors:
extern auto my_car;
For those who lack any sense of humor and want "just the facts Ma'am": the short answer is that there's never any reason to use auto at all. The only time you're allowed to use auto is with a variable that already has auto storage class, so you're just specifying something that would happen anyway. Attempting to use auto on any variable that doesn't have the auto storage class already will result in the compiler rejecting your code. I suppose if you want to get technical, your implementation doesn't have to be a compiler (but it is) and it can theoretically continue to compile the code after issuing a diagnostic (but it won't).
Small addendum by kaz:
There is also:
static auto my_car;
which requires a diagnostic according to ISO C. This is correct, because it declares that the car is broken down. The diagnostic is free of charge, but turning off the dashboard light will cost you eighty dollars. (Twenty or less, if you purchase your own USB dongle for on-board diagnostics from eBay).
The aforementioned extern auto my_car also requires a diagnostic, and for that reason it is never run through the compiler, other than by city staff tasked with parking enforcement.
If you see a lot of extern static auto ... in any code base, you're in a bad neighborhood; look for a better job immediately, before the whole place turns to Rust.
auto is a modifier like static. It defines the storage class of a variable. However, since the default for local variables is auto, you don't normally need to manually specify it.
This page lists different storage classes in C.
The auto keyword is useless in the C language. It is there because before the C language there existed a B language in which that keyword was necessary for declaring local variables. (B was developed into NB, which became C).
Here is the reference manual for B.
As you can see, the manual is rife with examples in which auto is used. This is so because there is no int keyword. Some kind of keyword is needed to say "this is a declaration of a variable", and that keyword also indicates whether it is a local or external (auto versus extrn). If you do not use one or the other, you have a syntax error. That is to say, x, y; is not a declaration by itself, but auto x, y; is.
Since code bases written in B had to be ported to NB and to C as the language was developed, the newer versions of the language carried some baggage for improved backward compatibility that translated to less work. In the case of auto, the programmers did not have to hunt down every occurrence of auto and remove it.
It's obvious from the manual that the now obsolescent "implicit int" cruft in C (being able to write main() { ... } without any int in front) also comes from B. That's another backward compatibility feature to support B code. Functions do not have a return type specified in B because there are no types. Everything is a word, like in many assembly languages.
Note how a function can just be declared extrn putchar, and then the only thing that makes it a function is that identifier's use: it appears in a function call expression like putchar(x), and that's what tells the compiler to treat that typeless word as a function pointer.
In C, auto is a keyword that indicates a variable is local to a block. Since that's the default for block-scoped variables, it's unnecessary and very rarely used (I don't think I've ever seen it used outside of examples in texts that discuss the keyword). I'd be interested if someone could point out a case where the use of auto was required to get a correct parse or behavior.
However, in the C++11 standard the auto keyword has been 'hijacked' to support type inference, where the type of a variable can be taken from the type of its initializer:
auto someVariable = 1.5; // someVariable will have type double
Type inference was added mainly to support declaring variables in templates, or variables returned from template functions, where types based on a template parameter (or deduced by the compiler when a template is instantiated) can often be quite painful to declare manually.
With the old Aztec C compiler, it was possible to turn all automatic variables to static variables (for increased addressing speed) using a command-line switch.
But variables explicitly declared with auto were left as-is in that case. (A must for recursive functions which would otherwise not work properly!)
The auto keyword is a bit like semicolons in Python: it was required by a previous language (B), but developers realized it was redundant because most things were auto anyway.
I suspect it was left in to help with the transition from B to C. In short, one use is for B language compatibility.
For example in B and 80s C:
/* The following function will print a non-negative number, n, to
the base b, where 2<=b<=10. This routine uses the fact that
in the ASCII character set, the digits 0 to 9 have sequential
code values. */
printn(n, b) {
    extern putchar;
    auto a;

    if (a = n / b)        /* assignment, not test for equality */
        printn(a, b);     /* recursive */
    putchar(n % b + '0');
}
auto can only be used for block-scoped variables. extern auto int is rubbish because the compiler can't determine whether this uses an external definition or whether to override the extern with an auto definition (also auto and extern are entirely different storage durations, like static auto int, which is also rubbish obviously). It could always choose to interpret it one way but instead chooses to treat it as an error.
There is one feature that auto does provide: it enables the 'everything is an int' rule inside a function. Outside a function, a = 3 is interpreted as a definition int a = 3, because assignments don't exist at file scope. Inside a function, a = 3 is an error, because the compiler always interprets it as an assignment to an external variable rather than a definition (even if there is no extern int a forward declaration in the function or at file scope). But a specifier such as static, const, volatile or auto implies that it is a definition, and the compiler takes it as one; auto just has none of the side effects of the other specifiers. auto a = 3 is therefore implicitly auto int a = 3. Admittedly, signed a = 3 has the same effect, and unsigned a = 3 is always an unsigned int.
Also note that auto has no effect on whether an object will be allocated to a register (unless some particular compiler pays attention to it, but that seems unlikely).
The auto keyword is an example of a storage class (a mechanism that decides a variable's lifetime and where it is stored). A variable declared with this keyword has a lifetime that extends only through its enclosing curly braces:
{
    auto int x = 8;
    printf("%d", x);    // here x is 8
    {
        auto int x = 3;
        printf("%d", x);    // here x is 3
    }
    printf("%d", x);    // here x is 8
}
I am sure you are familiar with storage class specifiers in C which are "extern", "static", "register" and "auto".
The definition of "auto" is pretty much given in the other answers, but here is a possible usage of the "auto" keyword that I am not sure about; I think it is compiler dependent.
You see, with respect to storage class specifiers, there is a rule. We cannot use multiple storage class specifiers for a variable. That is why static global variables cannot be externed. Therefore, they are known only to their file.
When you go to your compiler settings, you can enable an optimization flag for speed. One of the ways the compiler optimizes is that it looks for variables without storage class specifiers and then makes an assessment, based on availability of cache memory and some other factors, of whether it should treat that variable as if it had the register specifier. Now, what if we want to optimize our code for speed while knowing that a specific variable in our program is not very important, and we don't want the compiler to even consider it for a register? I thought that by putting auto, the compiler would be unable to add the register specifier to the variable, since typing "register auto int a;" or "auto register int a;" raises an error about using multiple storage class specifiers.
To sum it up, I thought auto could prohibit the compiler from treating a variable as a register variable during optimization.
This theory did not work with the GCC compiler; however, I have not tried other compilers.