C89 vs C99 GCC compiler - c

Is there a difference if I compile the following program using c89 vs c99? I get the same output. Is there really a difference between the two?
#include <stdio.h>
int main ()
{
    // Print string to screen.
    printf ("Hello World\n");
}
gcc -o helloworld -std=c99 helloworld.c
vs
gcc -o helloworld -std=c89 helloworld.c

// comments are not part of C89 but are OK in C99, and falling off the end of main() without returning a value is equivalent to return 0; in C99, but not in C89. From N1256 (pdf), 5.1.2.2.3p1:
If the return type of the main function is a type compatible with int, a return from the initial call to the main function is equivalent to calling the exit function with the value returned by the main function as its argument; reaching the } that terminates the main function returns a value of 0.
So your code has undefined behavior in C89, and well-defined behavior in C99.
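A quick way to observe this (assuming a POSIX shell; under -std=c89 the status reported by echo $? could be anything):
$ gcc -o helloworld -std=c99 helloworld.c
$ ./helloworld; echo $?
Hello World
0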

In theory, there should be one difference. Using // to mark a comment isn't part of C89, so if gcc enforced the C89 rules correctly, that would produce a compiler diagnostic (with -ansi -pedantic, it might do that, but I don't remember for sure).
That gives an idea of the general character though: if a program compiles as C89, it'll generally also compile as C99, and give exactly the same results. C99 mostly buys you some new features that aren't present in C89, so you can use (for example) variable length arrays, which aren't allowed in C89.
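For instance, here is a minimal sketch of a program that is valid C99 but not C89 (both the variable length array and the declaration inside the for clause are C99 features; gcc -std=c89 -pedantic will diagnose them):
#include <stdio.h>

int main(void)
{
    int n = 4;
    int arr[n];                  /* variable length array: C99 only */
    for (int i = 0; i < n; i++)  /* declaration in the for clause: also C99 only */
        arr[i] = i * i;
    printf("%d\n", arr[n - 1]);
    return 0;
}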
You may have to ask for pedantic rules enforcement to see all the differences though -- C99 is intended to standardize existing practice, and some of the existing practice is gcc extensions, some of which are enabled by default.

On this forum http://www.velocityreviews.com/forums/t287495-p2-iso-c89-and-iso-c99.html I found this:
Summary: C99 is standardized, has new keywords, new array features, complex numbers, new library functions, and such. More compilers are C89-complete, since they've had all this time to make them so.
A) ANSI X3.159-1989. This is the original 1989 C standard, dated December 1989, with Rationale. The main body of the language is described in section 3, and the "C library" -- stdio functions, and so on -- in section 4.
B) ISO 9899:1990. This is the original ISO C standard. "ANSI" is the American National Standards Institute, so the international crowd have to have their own standards with their own, different, numbering system. They simply adopted ANSI's 1989 standard, removed the Rationale, and renumbered the sections (calling them "clauses" instead). With very few exceptions you can just add three, so that most of the language is described in section -- er, "clause" -- 6, and the "C library" part in section 7.
C) ISO 9899:1999. This is the newfangled "C99" standard, with its
Variable Length Arrays, Flexible Array Members, new keywords like
"restrict" and "_Bool", new semantics for the "static" keyword, new
syntax to create anonymous aggregates, new complex-number types,
hundreds of new library functions, and so on.
The new ISO standard was immediately "back-adopted" by ANSI. I have not seen any official "ANSI-sanctioned" claim about this, but given the usual numbering systems, I would expect this to be ANSI Standard number X3.159-1999. (The numbering system is pretty obvious: a standard, once it comes out, gets a number -- X3.159 for ANSI, or just a number for ISO -- and a suffix indicating year of publication. An update to an existing standard reuses the number, with the new year.)
Although X3.159-1989 and 9899:1990 have different years and section
numbering, they are effectively identical, so "C89" and "C90" really
refer to the same language. Hence you can say either "C89" or "C90"
and mean the same thing, even to those aware of all the subtleties.
There were also several small revisions to the original 1990 ISO
standard: "Normative Addendum 1", and two "Technical Corrigenda"
(numbered, giving Technical Corrigendum 1 and TC2). The two TCs are
considered to be "bug fixes" for glitches in the wording of the
standard, while NA1 is an actual "change". In practice, the TCs do not
really affect users, while NA1 adds a whole slew of functions that
people can use, so NA1 really is more significant. NA1 came out in
1994, so one might refer to "ISO 9899:1990 as modified by NA1" as
"C94". I have seen it called "C95", too.

Related

Have the code examples from K&R ever been conforming?

The C Programming Language by Brian Kernighan and Dennis Ritchie contains a lot of examples such as this famous one (K&R 2nd edition 1.1):
#include <stdio.h>
main()
{
    printf("hello, world\n");
}
Here I note the following issues:
No return type.
Writing functions with no return type was allowed in C90, which the second edition of the book claims to follow; these default to int in C90. It is invalid C in later versions of the language.
No return statement.
A function with a return type and no return statement was not well-defined in C90. Writing main() with no return being equivalent to return 0; was a feature added in C99.
Empty parameter list of main().
This is valid C still (as of C17) but has always been an obsolescent feature even in C90. (Upcoming C23 talks of finally getting rid of K&R style functions.)
My question:
Was any code in K&R 2nd edition ever a conforming program, in any version of the standard?
By definition, any source text or collection thereof which is "accepted" by a Conforming C Implementation is a "Conforming C Program". Because implementations are given broad latitude to extend the language in any way which does not affect the behavior of any Strictly Conforming C Programs, any source text T which would not otherwise be a Conforming C Program could be turned into one by modifying a Conforming C Implementation so that, if given a program that doesn't match T, it would process it normally, and if fed a copy of T, it would behave as though it were fed some other program that it would accept.
While this may seem an absurdly broad definition, it satisfies one of the major goals of the C Standards Committee, which was to ensure that if any existing programs could accomplish a task, the task could be accomplished by a Conforming C Program.
As for whether the programs were Strictly Conforming under C89, that's a bit harder to answer. The Standard says that if execution falls through the end of main() it will return an Undefined Value to the host environment, and imposes no requirements about the consequence of doing so, which would suggest that such an action would invoke Undefined Behavior. On the other hand, the Standard also imposes no requirements upon what happens if a program returns EXIT_SUCCESS, nor what happens if it returns EXIT_FAILURE, nor if it returns some other value. Thus, all such actions could be viewed as invoking Undefined Behavior. On the other hand, viewing things in such fashion would make it impossible for any program which terminates to be Strictly Conforming.
I think the most reasonable way of interpreting the Standard would be to say that a program whose execution falls through the end of main() waives any control it might have had to affect what the execution environment does once it terminates. If all courses of action the host environment could perform after a program exits would be equally acceptable, a program's failure to do anything to influence which course of action is taken would not be a defect.
In considering whether a program that fails to specify a return value, or any program for that matter, is "Strictly Conforming", one cannot merely examine the source text, but must also consider the application requirements. If one needs a program to output the characters x and y once each, in some order, the following would be a strictly conforming program that accomplishes that:
#include <stdio.h>
int outputx(void) { return printf("x"); }
int outputy(void) { return printf("y"); }
int main(void)
{
    return outputx() + outputy() && printf("\n") && 0;
}
If, on the other hand, one needs a program to output "xy", the above would not be a strictly conforming program for that purpose. Thus, I would say that if the application requirements for some task specify that a program must use its return value to influence the host environment, a program that falls through the end of main would not be a Strictly Conforming C Program to accomplish that task. If, however, such influence over the host environment is not part of the application requirements for a task, then a Strictly Conforming C Program could waive such control.
Citations below:
From N1570 section 4 paragraph 7:
A conforming program is one that is acceptable to a conforming implementation. 5) Strictly conforming programs are intended to be maximally portable among conforming implementations. Conforming programs may depend upon nonportable features of a conforming implementation.
Undefined Behavior is defined in 3.4.3:
behavior, upon use of a nonportable or erroneous program construct or of erroneous data, for which this International Standard imposes no requirements
From the C99 Rationale, talking about the definition of conformance [emphasis original]:
A strictly conforming program is another term for a maximally portable program. The goal is to give the programmer a fighting chance to make powerful C programs that are also highly portable, without seeming to demean perfectly useful C programs that happen not to be portable, thus the adverb strictly.
The fact that a program exits without setting a return value may make it non-portable, but the Standard deliberately avoids "demeaning" non-portable programs by calling them non-conforming.
No, the programs in the K&R book were never conforming programs 1) (C17 4/7) under any version of the standard.
In C90 (ISO 9899:1990), the code invoked undefined behavior because of the missing return statement.
In C99 (ISO 9899:1999) and beyond, the code won't compile because of the implicit int.
Sources below.
Regarding implicit int, one major difference in function return types between C90 and latter versions can be found here:
C90 6.7.1 Function definitions
The return type of a function shall be void or an object type other than array.
/--/
If the declarator includes an identifier list, the types of the parameters may be declared in a following declaration list. Any parameter that is not declared has type int.
C17 6.9.1 Function definitions
The return type of a function shall be void or a complete object type other than array type.
/--/
If the declarator includes an identifier list, the types of the parameters shall be declared in a following declaration list. In either case, the type of each parameter is adjusted as described in 6.7.6.3 for a parameter type list; the resulting type shall be a complete object type.
The main difference is the "complete object type" wording; a complete object type is, roughly, an object type whose size is known (C17 6.2.5). We can conclude that implicit int was allowed in C90 both as the return type and as part of a (non-prototype) parameter list.
Regarding no return statement, this text was always there for general functions:
C90 6.6.6.4
If a return statement without an expression is executed and the value of the function call is used by the caller, the behavior is undefined. Reaching the } that terminates a function is equivalent to executing a return statement without an expression.
C17 6.9.1/12
If the } that terminates a function is reached, and the value of the function call is used by the caller, the behavior is undefined.
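A minimal sketch of the rule both quotes describe (f is a hypothetical function):
int f(int x)
{
    if (x > 0)
        return x;
    /* falls off the end when x <= 0: allowed, as long as the caller
       does not use the value of such a call */
}

int main(void)
{
    int a = f(1);    /* fine: a value was returned and is used */
    int b = f(-1);   /* undefined behavior: no value was returned, yet it is used */
    return a + b;
}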
However, main() is a special case and an exception was added in C99:
C99 5.1.2.2.3
If the return type of the main function is a type compatible with int, a return from the initial call to the main function is equivalent to calling the exit function with the value returned by the main function as its argument; reaching the } that terminates the main function returns a value of 0.
Whereas in C90, the equivalent text says:
C90 5.1.2.2.3
A return from the initial call to the main function is equivalent to calling the exit function with the value returned by the main function as its argument. If the main function executes a return that specifies no value, the termination status returned to the host environment is undefined.
Regarding empty parameter lists, it has been marked as obsolescent from C90 to C17. See future language directions, for example C17 6.11 (or C90 6.9, identical text):
6.11.6 Function declarators
The use of function declarators with empty parentheses (not prototype-format parameter type declarators) is an obsolescent feature.
6.11.7 Function definitions
The use of function definitions with separate parameter identifier and declaration lists (not prototype-format parameter type and identifier declarators) is an obsolescent feature.
This does not, however, mean that code using the feature isn't conforming, at least up to ISO 9899:2018. It's simply not recommended practice, and it was not recommended practice at the point where K&R 2nd edition was released either.
1) C17 from chp 4:
A conforming implementation may have extensions (including additional library functions), provided they do not alter the behavior of any strictly conforming program.
A conforming program is one that is acceptable to a conforming implementation.
A strictly conforming program shall use only those features of the language and library specified in this International Standard. It shall not produce output dependent on any unspecified, undefined, or implementation-defined behavior, and shall not exceed any minimum implementation limit.
This means that a conforming program may use features of a conforming implementation that are non-portable, so long as those extensions do not alter the behavior of any strictly conforming program -- for example by invoking undefined behavior explicitly listed as such in the standard.
I compiled the program
#include <stdio.h>
main()
{
    printf("hello, world\n");
}
under two well-regarded compilers, gcc and clang. Just for fun I added the --pedantic option also. As far as I know both of these compilers would be considered "conforming", and I believe that's one of the things their authors certainly strive for.
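For reference, the invocations were of this general form (hypothetical file name kr.c; -pedantic is the single-dash spelling of the option):
$ gcc -pedantic kr.c -o kr && ./kr
hello, world
$ clang -pedantic kr.c -o kr && ./kr
hello, world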
Both compilers produced an executable which printed hello, world. Under the definition that
A conforming program is one that is acceptable to a conforming implementation,
I conclude that the program is conforming.
I pass no judgement on the question of whether the program would have been conforming under C89.
Although I have not studied the code examples in K&R2 in some years, I believe that most/all of the rest of them are similarly conforming, despite various pedagogical or other shortcuts which might render them not strictly conforming.
Some quotes:
"The first edition, published February 22, 1978, was the first widely available book on the C programming language. Its version of C is sometimes termed K&R C (after the book's authors), often to distinguish this early version from the later version of C standardized as ANSI C."
(source Wikipedia)
In other words, K&R edition 1 predates any official C standards. At the time the only specification was "The C Reference Manual" by Dennis M. Ritchie.
"In April 1988, the second edition of the book was published, updated to cover the changes to the language resulting from the then-new ANSI C standard, particularly with the inclusion of reference material on standard libraries."
(source Wikipedia)
In other words, K&R edition 2 was "aligned with" the first official ANSI C standard otherwise known as C89.
However, at the time K&R edition 2 was published, C89 was not yet complete. According to the Wikipedia page on ANSI C:
"In 1983, the American National Standards Institute formed a committee, X3J11, to establish a standard specification of C. In 1985, the first Standard Draft was released, sometimes referred to as C85. In 1986, another Draft Standard was released, sometimes referred to as C86. The prerelease Standard C was published in 1988, and sometimes referred to as C88."
(source Wikipedia)
Thus, there may be differences between what K&R says and what the ANSI C standard says.

What does it mean that C requires "all variables to be defined at the beginning of the scope"?

I'm learning about the differences between C++ and C, and one of them is that, unlike C++, C does not allow a variable to be defined anywhere except at the beginning of a scope (from Thinking in C++).
The author gave an example where it would work in C++ but not in C, saying that retval has to be defined before the cout:
//: C06:DefineInitialize.cpp
// Defining variables anywhere
#include "../require.h"
#include <iostream>
#include <string>
using namespace std;

class G {
    int i;
public:
    G(int ii);
};

G::G(int ii) { i = ii; }

int main() {
    cout << "initialization value? ";
    int retval = 0;
    cin >> retval;
    require(retval != 0);
    int y = retval + 3;
    G g(y);
} ///:~
However, when I run this code, which similarly defines a variable after a statement, it still works:
#include <stdlib.h>
#include <stdio.h>

int main() {
    printf("Hello\n");
    int i;
    i = 0;
    exit(0);
}
C does not allow a variable to be defined anywhere except at the beginning of the scope
This hasn't been true for 20 years, but 20 years ago is when Thinking In C++ was last revised.
When someone says "I'm programming in C" the question is then "which version of C?" There is a standard, but it has had several major revisions, and many non-standard extensions.
Prior to 1999, C required you to declare variables only at the start of the scope. In 1999 the standard changed to allow variable declarations anywhere you like.
C compilers can be configured to comply with various standards and extensions. Many C compilers will default to C99 or C11, often with extensions. The very common clang compiler currently defaults to C11 with GNU extensions. You can select which standard with -std. For example, if you want to compile your code as C89: clang -std=c89.
Compiling your code using the C89 standard that Thinking In C++ would have been written to, and adding extra warning flags (because -Wall doesn't actually mean "all warnings"), gives us the warning we expect.
$ cc -Wall -Wextra -pedantic -std=c89 test.c
test.c:6:9: warning: ISO C90 forbids mixing declarations and code [-Wdeclaration-after-statement]
int i;
^
1 warning generated.
With -std=c99 there's no problem.
$ cc -Wall -Wextra -pedantic -std=c99 test.c
When learning C, I'd strongly advise to avoid extensions, write to the standard, and turn on lots of warnings. -std=c11 is a good choice for the moment.
A problem with a lot of material on C and C++ is the C standards have changed over the last 40+ years, and a lot of material is out of date.
Here's a brief history of the fault lines.
1978 introduces K&R C, named for Brian Kernighan and Dennis Ritchie, who authored The C Programming Language aka "K&R". At this point there was no standard for C.
You'll still see this legacy in K&R style function definitions, where the parameter types are declared separately:
int foo(s, f, b)
char *s;
float f;
struct Baz *b;
{
    return 5;
}
1989 brings the first C standard which goes by many names: ANSI C / ISO C / C89. Many features are added. Function prototypes as we know them today are introduced. Today, just about every C compiler will comply to C89.
And yes, variables must be defined at the beginning of the scope. This legacy carries on even in material produced today resulting in awkward blocks of declarations far away from where they're used.
1999 brings a major revision of the standard, C99. This adds inline functions, more types, variable-length arrays, one-line comments //, and the ability to declare variables anywhere you like.
The process of C compilers adopting C99 has been very slow and is still ongoing. But you can definitely rely on declaring variables where you like.
2011 brings C11 and 2018 brings C17 / C18. It's not relevant to go into their changes to the standard here, just be aware of them.
Finally, there are non-standard extensions. The major ones are POSIX, GCC, and Microsoft extensions. Many compilers implement POSIX extensions. Some compilers, such as clang, have adopted GCC extensions. Microsoft extensions, meanwhile, are typically only available for Microsoft compilers and often conflict with the standard. Be aware that a lot of material will use extensions without telling you, resulting in code that only works on specific compilers.
C++ underwent a similar process of standardization, revision, and extension which is still ongoing. C++'s major versions are C++98, C++03, C++11, C++14, and C++17, and it also has GNU and Microsoft extensions.
Thinking In C++ was first published in 1995, so it would be written to C89 and non-standard C++. Its 2nd edition was published in 2000, just after C99. That's unlikely to have been enough time for the author to be comfortable with C99 conventions, and their readers would likely still be using C compilers which used C89 or even K&R standards.
Adoption of C and C++ standards by compilers can be very, very slow. You will still find a lot of material on C is written to C89.

C function calls: Understanding the "implicit int" rule

If "a function" were compiled separately, the mismatch would not be detected, "the function" would return a double that main would treat as an int... In the light of what we have said about how declarations must match definitions this might seems surprising. The reason a mismatch can happen is that if there is no function prototype, a function is implicitly declared by its first appearance in an expression, such as
sum += "the function"(line);
If a name that has not been previously declared occurs in an expression and is followed by a left parenthesis, it is declared by context to be a function name, the function is assumed to return an int, and nothing is assumed about its arguments.
I apologize beforehand for the ambiguous question, but what does this mean?
By the way this is page 73 chapter 4.3 from Brian W. Kernighan and Dennis M. Ritchie's C Programming Language book, 2nd edition.
K&R2 covers the 1989/1990 version of the language. The current ISO C standard, published in 2011, drops the "implicit int" rule, and requires a visible declaration for any function you call. Compilers don't necessarily enforce this by default, but you should be able to request more stringent warnings -- and you definitely should. In well-written new code, the rule is irrelevant (but it is still necessary to understand it).
An example: the standard sqrt() function is declared in <math.h>:
double sqrt(double);
If you write a call without the required #include <math.h>:
double x = 64.0;
double y = sqrt(x);
a C90 compiler will assume that sqrt returns int -- and it will generate code to convert the result from int to double. The result will be garbage, or perhaps a crash.
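Here is a complete sketch of that failure mode (assuming a compiler in C90 mode; modern compilers may recognize sqrt as a built-in and warn or even compensate):
/* implicit.c -- compile with: gcc -std=c90 implicit.c -lm */
#include <stdio.h>
/* <math.h> deliberately omitted */

int main(void)
{
    double x = 64.0;
    double y = sqrt(x);   /* implicitly declared as int sqrt(); the returned
                             double is misread as an int, so y is garbage */
    printf("%f\n", y);    /* expected 8.000000, typically prints nonsense */
    return 0;
}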
(You could manually declare sqrt yourself, but that's the wrong solution.)
So don't do that. Always include whatever header is required for any function you call. You might get away with calling an undeclared function if it does return int (and if your compiler doesn't enforce strict C99 or C11 semantics, and if a few other conditions are satisfied), but there's no good reason to do so.
Understanding the "implicit int" rule is still useful for understanding the behavior of old or poorly written code, but you should never depend on it in new code.
Function prototypes were introduced into the language late.
Before prototypes, the compiler would assume that every argument passed to every unknown function should be passed as an integer and would assume that the return value was also an integer.
This worked fine for the few cases where it was correct, but meant people had to write programs in an awkward order so that functions would never rely on unknown functions that did not match this expectation.
When prototypes were introduced into C89 (aka ANSI C or ISO C), the prototypes allow the compiler to know exactly what types of arguments are expected and what types of results will be returned.
It is strongly recommended that you use function prototypes for all new code; when working on an entirely old code base, the prototypes might be harmful. (Or, if the code must be compilable on a pre-ANSI C compiler, then you might wish to leave off the prototypes so it can be built on the ancient software. gcc is the only place I've seen this in a long time.)
It's just stating that if the compiler comes across code that calls an unknown function, then it implicitly treats it as if it had already seen a declared prototype of the form int unknown();
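A short sketch of that rule in action (g is a hypothetical function name; this compiles under -std=c90, the call site behaving as if int g(); had been declared):
int main(void)
{
    int r = g(42);   /* g is undeclared here: assumed to return int */
    return r;
}

int g(x)             /* K&R-style definition, compatible with the implicit declaration */
int x;
{
    return x == 42 ? 0 : 1;
}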

C1x: When will it land, what to expect?

C99 still isn't supported by many compilers, and much of the focus is now on C++, and its upcoming standard C++1x.
I'm curious as to what C will "get" in its next standard, when it will get it, and how it will keep C competitive. C and C++ are known to feed on one another's improvements, will C be feeding on the C++1x standard?
What can I look forward to in C's future?
The ISO/IEC 9899:2011 standard, aka C11, was published in December 2011.
The latest draft is N1570; I'm not aware of any differences between it and the final standard. There's already a Technical Corrigendum fixing an oversight in the specification of __STDC_VERSION__ (now 201112L) and the optional __STDC_LIB_EXT1__ (now 201112L).
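One way to check what your own compiler claims is to print the macro (a minimal probe; C89/C90 predates __STDC_VERSION__, C94 defines it as 199409L, C99 as 199901L, and C11 as 201112L):
#include <stdio.h>

int main(void)
{
#ifdef __STDC_VERSION__
    printf("__STDC_VERSION__ = %ldL\n", (long)__STDC_VERSION__);
#else
    printf("__STDC_VERSION__ not defined (C89/C90 or earlier)\n");
#endif
    return 0;
}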
I was typing a list of features, but noticed the Wikipedia page on C1X has a pretty complete listing of all the proposed changes.
The ISO C working group posts 'after meeting' mailings on its website. One of the more interesting is this Editor's Report.
Here's a summary from the Wikipedia page:
Alignment specification (_Alignas specifier, alignof operator, aligned_alloc function)
Multithreading support (_Thread_local storage-class specifier, <threads.h> header including thread creation/management functions, mutex, condition variable and thread-specific storage functionality)
Improved Unicode support (char16_t and char32_t types for storing UTF-16/UTF-32 encoded data, including the corresponding u and U string literal prefixes and conversion functions in <uchar.h>)
Removal of the gets function
Bounds-checking interfaces (Annex K)
Analyzability features (Annex L)
It looks like gcc as of 4.6 is starting to look at C1x. They claim to have:
Static assertions (_Static_assert keyword)
Typedef redefinition
New macros in <float.h>
Anonymous structures and unions
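A quick sketch exercising two of those features (builds with gcc -std=c1x on 4.6, or -std=c11 on later versions):
#include <stdio.h>

/* compile-time assertion: fails the build if the condition is false */
_Static_assert(sizeof(int) >= 2, "int must be at least 16 bits");

struct point {
    struct { int x, y; };   /* anonymous struct: members are accessed directly */
};

int main(void)
{
    struct point p;
    p.x = 3;   /* no intermediate member name needed */
    p.y = 4;
    printf("%d %d\n", p.x, p.y);
    return 0;
}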
Probably the best place to find the current status would be to look at the latest draft of the new version of the C standard. Warning: though it's coming directly from the committee, the server behind that link isn't always the most responsive...

What are the major differences between ANSI C and K&R C?

The Wikipedia article on ANSI C says:
One of the aims of the ANSI C standardization process was to produce a superset of K&R C (the first published standard), incorporating many of the unofficial features subsequently introduced. However, the standards committee also included several new features, such as function prototypes (borrowed from the C++ programming language), and a more capable preprocessor. The syntax for parameter declarations was also changed to reflect the C++ style.
That makes me think that there are differences. However, I didn't see a comparison between K&R C and ANSI C. Is there such a document? If not, what are the major differences?
EDIT: I believe the K&R book says "ANSI C" on the cover. At least I believe the version that I have at home does. So perhaps there isn't a difference anymore?
There may be some confusion here about what "K&R C" is. The term refers to the language as documented in the first edition of "The C Programming Language." Roughly speaking: the input language of the Bell Labs C compiler circa 1978.
Kernighan and Ritchie were involved in the ANSI standardization process. The "ANSI C" dialect superseded "K&R C", and subsequent editions of "The C Programming Language" adopt the ANSI conventions. "K&R C" is a "dead language," except to the extent that some compilers still accept legacy code.
Function prototypes were the most obvious change between K&R C and C89, but there were plenty of others. A lot of important work went into standardizing the C library, too. Even though the standard C library was a codification of existing practice, it codified multiple existing practices, which made it more difficult. P.J. Plauger's book, The Standard C Library, is a great reference, and also tells some of the behind-the-scenes details of why the library ended up the way it did.
The ANSI/ISO standard C is very similar to K&R C in most ways. It was intended that most existing C code should build on ANSI compilers without many changes. Crucially, though, in the pre-standard era, the semantics of the language were open to interpretation by each compiler vendor. ANSI C brought in a common description of language semantics which put all the compilers on an equal footing. It's easy to take this for granted now, some 20 years later, but this was a significant achievement.
For the most part, if you don't have a pre-standard C codebase to maintain, you should be glad you don't have to worry about it. If you do--or worse yet, if you're trying to bring an old program up to more modern standards--then you have my sympathies.
There are some minor differences, but I think later editions of K&R are for ANSI C, so there's no real difference anymore.
"C Classic" for lack of a better terms had a slightly different way of defining functions, i.e.
int f( p, q, r )
int p, float q, double r;
{
// Code goes here
}
I believe the other difference was function prototypes. Prototypes didn't have to - in fact they couldn't - take a list of arguments or types. In ANSI C they do.
Function prototypes.
const and volatile qualifiers.
Wide character support and internationalization.
Function pointers can be used without dereferencing.
Another difference is that function return types and parameter types did not need to be declared. They would be assumed to be int.
f(x)
{
    return x + 1;
}
and
int f(x)
int x;
{
    return x + 1;
}
are identical.
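For comparison, the modern prototype-style equivalent, valid from C89 onward:
int f(int x)
{
    return x + 1;
}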
The major differences between ANSI C and K&R C are as follows:
function prototyping
support of the const and volatile data type qualifiers
support of wide characters and internationalization
permission for function pointers to be used without dereferencing
ANSI C adopts the C++ function prototype technique, where function definitions and declarations include function names, argument data types, and return value data types. Function prototypes enable ANSI C compilers to check for function calls in user programs that pass invalid numbers of arguments or incompatible argument data types. These fix a major weakness of K&R C compilers.
Example: to declare a function foo and require that foo take two arguments:
unsigned long foo(char *fmt, double data)
{
    /* body of foo */
}
The differences are:
Prototypes.
Wide character support and internationalisation.
Support for the const and volatile keywords.
Function pointers can be used without dereferencing.
A major difference nobody has yet mentioned is that before ANSI, C was defined largely by precedent rather than specification; in cases where certain operations would have predictable consequences on some platforms but not others (e.g. using relational operators on two unrelated pointers), precedent strongly favored making platform guarantees available to the programmer. For example:
On platforms which define a natural ranking among all pointers to all objects, application of the relational operators to arbitrary pointers could be relied upon to yield that ranking.
On platforms where the natural means of testing whether one pointer is "greater than" another never has any side-effect other than yielding a true or false value, application of the relational operators to arbitrary pointers could likewise be relied upon never to have any side-effects other than yielding a true or false value.
On platforms where two or more integer types shared the same size and representation, a pointer to any such integer type could be relied upon to read or write information of any other type with the same representation.
On two's-complement platforms where integer overflows naturally wrap silently, an operation involving unsigned values smaller than int could be relied upon to behave as though the values were unsigned in cases where the result would be between INT_MAX+1u and UINT_MAX and was not promoted to a larger type, nor used as the left operand of >>, nor as either operand of /, %, or any comparison operator. Incidentally, the rationale for the Standard gives this as one of the reasons small unsigned types promote to signed.
Prior to C89, it was unclear to what lengths compilers for platforms where the above assumptions wouldn't naturally hold might be expected to go to uphold those assumptions anyway, but there was little doubt that compilers for platforms which could easily and cheaply uphold such assumptions should do so. The authors of the C89 Standard didn't bother to expressly say that because:
Compilers whose writers weren't being deliberately obtuse would continue doing such things when practical without having to be told (the rationale given for promoting small unsigned values to signed strongly reinforces this view).
The Standard only required implementations to be capable of running one possibly-contrived program without a stack overflow, and recognized that an obtuse implementation could treat any other program as invoking Undefined Behavior; the authors didn't think it was worth worrying about obtuse compiler writers writing implementations that were "conforming" but useless.
Although "C89" was interpreted contemporaneously as meaning "the language defined by C89, plus whatever additional features and guarantees the platform provides", the authors of gcc have been pushing an interpretation which excludes any features and guarantees beyond those mandated by C89.
The biggest single difference, I think, is function prototyping and the syntax for describing the types of function arguments.
Despite all the claims to the contrary, K&R C was and is quite capable of handling any sort of work, from down close to the hardware on up.
The problem now is to find a compiler (preferably free) that can give a clean compile on a couple of million lines of K&R C without having to mess with it, and that runs on something like an AMD multi-core processor.
As far as I can see, having looked at the source of the GCC 4.x.x series, there is no simple hack to reactivate the -traditional and -cpp-traditional flag functionality to their previous working state without more effort than I am prepared to put in. It would be simpler to build a K&R pre-ANSI compiler from scratch.
