How many styles of writing functions are there in C?

So far, I know two styles:
/* 1st style */
int foo(int a) {
    return a;
}

/* 2nd style */
int foo(a)
int a;
{
    return a;
}
(I saw someone writing code in the 2nd style. I was surprised at first, but the 2nd style worked (under gcc, as I tested). This made me curious, so I wanted to ask this question.)

I won't call these styles, but language variants (or dialects).
A coding style is a set of optional conventions that might not be followed. For instance, some coding styles request that macro names be all capitals (but your code will compile even if you don't follow that rule).
Your "2nd style" is called Kernighan & Ritchie C. It is the old C defined in the late 1970s (in the very first edition of a famous book on C by Kernighan and Ritchie; subsequent editions have been conforming to later C standards). It is an obsolete language.
Current compilers often follow the C99 ISO standard (published in 1999), which has since been superseded by the newer C11 standard (published in 2011).
GCC accepts the C99 standard with the -std=c99 option. I strongly suggest compiling with gcc -Wall -std=c99; recent GCC releases (i.e. 4.6 and 4.7) accept -std=c11, IIRC, for the newer C11 standard.
Don't code today in the old Kernighan and Ritchie C dialect: it is obsolete and less and less supported by compilers. IMHO C99 is a good standard to follow if you are cautious. Take advantage of some of its features (in particular, the ability to mix declarations and statements inside a block; older C dialects required all declarations to appear at the start of a block).
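For example, here is a minimal sketch of mixing declarations and statements the C99 way (the function and variable names are just for illustration; compile with something like gcc -Wall -std=c99):

#include <stdio.h>

int sum_squares(int n)
{
    int total = 0;
    for (int i = 0; i < n; ++i) {   /* C99: loop variable declared inside the for statement */
        int sq = i * i;             /* C99: declaration after statements within the block */
        total += sq;
    }
    return total;
}

int main(void)
{
    printf("%d\n", sum_squares(4)); /* prints 14 */
    return 0;
}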
The standard has progressed, in particular because it added features and is more precise w.r.t. current systems and practices (e.g. multi-core processors).

There are (at least) two disadvantages when using the 2nd style:
If the function prototype is also missing, the compiler will not check for proper argument types. And, if I remember correctly, the number of arguments will also not be checked (see the sketch after this list).
This (K&R, 1st Ed) style is valid only for backward compatibility ... someday, the C standard will (perhaps) disallow this style entirely and your programs will stop compiling.
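As a sketch of the first disadvantage (the function and variable names here are made up): with a K&R-style definition and no prototype in scope, a compiler that still accepts this dialect (e.g. gcc -std=c99) will happily compile calls with the wrong argument types and count:

int add(a, b)
int a;
int b;
{
    return a + b;
}

int main(void)
{
    int ok  = add(1, 2);    /* fine */
    int bad = add(3.5);     /* wrong type and wrong count, yet no compile-time error;
                               the behavior at run time is undefined */
    return ok + bad;
}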
Also, for better readability, you can put the function's return type on its own line (especially useful with long-winded return types like unsigned long int, and in headers):
int
foo(int a)
{
    return a;
}

The second style is the older style and is supported for backwards compatibility. I don't know of any other styles off the top of my head, but you should use the first (newer) style. The second (older) style was already deprecated when I started working with C (1994).

The 2nd one is K&R C-style syntax, but it is obsolete now. Check out Basile Starynkevitch's answer.

Related

Why don't Kernighan and Ritchie include int for their main functions?

Every example in the book defines main with the int omitted. Why is this, and why does it still compile the same without it? Are C compilers designed with int implied?
At the time Kernighan and Ritchie wrote, int was the default type. Computing was in general simpler than it is today and had more constrained resources. Languages were designed to satisfy simple needs. And they were not completely designed; they evolved from scratch, with changes contributed by different people in different places with different things in mind. So the C at the time was not designed with modern ideas about type safety or being strict about grammar to reduce the frequency of errors. Some features put into it were things that people thought were nice and immediately useful, not necessarily things that had been analyzed for their deeper effects or future effects.
One motivating factor in language design was brevity. int was a frequently used type, because it was intended to be the “natural” type for whatever machine was being used, so it was convenient to make it the default. Therefore, many declarations defaulted to int, so it was not required.
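A minimal sketch of the kind of code this default allowed (C99 removed the implicit-int rule, so newer compilers will at least warn about it):

main()          /* no return type given: pre-C99 compilers assume int */
{
    return 0;
}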
Modern compilers may still allow the absence of int so that old code can still be compiled. Often, such compilers have switches to be more strict about which language dialect(s) they accept. For example, in GCC and Clang, you can specify -std=c11 to request stricter conformance to the C standard. (And -std=c17 or -std=c18 may be available soon.) When writing new programs, you should use such switches so that your compiler will issue diagnostics about code that does not conform to modern standards. This will help write code that better conforms to the current standard, has more well-defined behavior, and is more likely to remain useful longer into the future.
Yep. Blow the dust off your white book, and you will see exactly that. K&R is what it is and it happened well before ANSI C.
Another difference is function arguments. If I recall correctly, it was done like this:
some_function_name(a, b)
int a;
long float b;   /* "long float" was an early synonym for double */
{
    printf("%d %lf\n", a, b);
}
C and C++ have mutated a lot over the decades.

Using non-standard functions in Code::Blocks

I got the book "Beginning C" by Ivor Horton and I'm halfway through it and I like it; so far so good. I use Code::Blocks on Windows as my IDE, and now I've run into a problem I have not been able to solve for about 3 days.
The author mentions some "optional" functions in <string.h>, like strnlen_s(), and also says that these are available in the new standard — C11 (the book is from 2013; I don't know how new C11 actually is), and he also gives a piece of code that will determine "whether the standard library that comes with your C compiler supports these optional functions".
This is the code:
#include <stdio.h>

int main(void)
{
#if defined __STDC_LIB_EXT1__
    printf("Optional functions are defined.\n");
#else
    printf("Optional functions are not defined.\n");
#endif
    return 0;
}
So I ran the code to check whether GCC in Code::Blocks supports them, and determined that it doesn't. The book didn't recommend a compiler or an IDE; I picked Code::Blocks with GCC on my own, since that's what I do my exams in at college, so I figured I should get familiar with the environment.
The thing is, I have no idea how to "fix" this, since strnlen() doesn't work, strnlen_s() doesn't work, and a bunch of others don't either, and I can't really continue through the book. Not that I need them, or that I can't do it any other way (strlen() works just fine), but it would be nice to know how to use these non-standard functions.
Up-to-date versions of GCC certainly do support C11; you need to enable it with the compiler flag -std=c11.
I presume you're using some flavour of MinGW with Code::Blocks - I recommend using MinGW-W64 as it is actively maintained and very up to date.
Also, bundled toolchains of MinGW-W64's gcc are available at TDM-GCC.
The Code::Blocks IDE itself doesn't care which version of C you're using, that doesn't affect what libraries you have available.
You are speaking of the optional Annex K that Microsoft pushed through.
K.2 Scope
1 This annex specifies a series of optional extensions that can be useful in the mitigation of security vulnerabilities in programs, and comprise new functions, macros, and types declared or defined in existing standard headers.
2 An implementation that defines __STDC_LIB_EXT1__ shall conform to the specifications in this annex.
3 Subclause K.3 should be read as if it were merged into the parallel structure of named subclauses of clause 7.
It is generally seen as deeply flawed, and Microsoft's trying to force its use is seen as a severe nuisance.
That's especially the case as they are the only major player implementing them, and their versions are non-conformant.
glibc with gcc, for example, provides most of the supposed advantages of that annex without introducing new functions, discouraging use of half the standard library, or forcing such a cumbersome API on programmers.
You might want to read the C tag-wiki, and especially grab a draft of the C11 standard (which is from 2011, as the name should imply).
The optional Annex K from the C11 Standard is not widely adopted yet (see Deduplicator's comment below). For instance as of February 2015 it hasn't been merged into glibc.
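If your C library does not define __STDC_LIB_EXT1__, one workaround is a small bounded helper of your own; here is a minimal sketch (the name my_strnlen is made up for this example):

#include <stddef.h>

/* Returns the length of s, but never examines more than maxlen characters. */
static size_t my_strnlen(const char *s, size_t maxlen)
{
    size_t n = 0;
    while (n < maxlen && s[n] != '\0')
        ++n;
    return n;
}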
The good news is that you might try an alternative compiler. For instance, Pelles C for Windows is a modified LCC with enhanced support for the newest C11 features (like atomics and the C11 threads model, which I believe are also mentioned in your book). Here is a basic program that compiles and runs in it:
/* Request the Annex K declarations before including the headers, as K.3.1.1 requires */
#define __STDC_WANT_LIB_EXT1__ 1
#include <stdio.h>
#include <string.h>

int main(void)
{
#if defined __STDC_LIB_EXT1__
    printf("Optional functions are defined.\n");
#else
    printf("Optional functions are not defined.\n");
#endif
    char *str = "Hello Annex K";
    printf("%zu\n", strnlen_s(str, 5));
    return 0;
}
Output is:
Optional functions are defined.
5
Press any key to continue...

Which version of C is more appropriate for students to learn - C89/90 or C99?

I'm looking into learning C basics and syntax before beginning Systems Programming next month. When doing some reading, I came across the C89/99 standards. According to Wikipedia,
C99 introduced several new features, including inline functions, several new data types (including long long int and a complex type to represent complex numbers), variable-length arrays, support for variadic macros (macros of variable arity) and support for one-line comments beginning with //, as in BCPL or C++. Many of these had already been implemented as extensions in several C compilers.
C99 is for the most part backward compatible with C90, but is stricter in some ways; in particular, a declaration that lacks a type specifier no longer has int implicitly assumed. A standard macro __STDC_VERSION__ is defined with value 199901L to indicate that C99 support is available. GCC, Sun Studio and other compilers now support many or all of the new features of C99.
I borrowed a copy of K&R, 2nd Edition, and it uses the C89 standard. For a student, does the use of C89 invalidate some subjects covered in K&R, and if so, what should I look out for?
There is no reason to learn C89 or C90 over C99 - it has quite literally been superseded. It's easy to find C99 compilers and there's no reason whatsoever to learn an earlier standard.
This doesn't mean that your professor won't force C89 upon you. From the various questions posted here marked homework, I get the feeling that many, many C (and, unfortunately, C++) courses haven't moved on since C89.
From the perspective of a starting student, the chances are that you won't really notice the difference - there's plenty of C that's both C99 and C89/90 to be covered.
Use the C99 standard, it's newer and has more features. Particularly useful may be the bool type in <stdbool.h> and the int32_t etc. family of types; the latter prevents a lot of unportable code that relies on ints having a certain size. AFAIK, it doesn't invalidate K&R, though some example programs may be written in a slightly different style now.
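A small sketch of those two headers in use (the values here are arbitrary):

#include <stdbool.h>    /* bool, true, false */
#include <stdint.h>     /* fixed-width integer types */
#include <inttypes.h>   /* PRId32, for printing int32_t portably */
#include <stdio.h>

int main(void)
{
    bool    done  = false;
    int32_t count = 42;     /* exactly 32 bits on every conforming platform */

    if (!done)
        printf("count = %" PRId32 "\n", count);
    return 0;
}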
Note that some compilers still don't support C99 properly. I believe that GCC still requires the use of a -std=c99 flag to enable it; many Unix/Linux systems have a c99 command that wraps GCC and enables C99.
The same goes for many university professors. I surprised mine by handing in a program that used bool in my freshman year. He'd never heard of that type in C :)
While I generally agree with the others, it is worth noting that K&R is such a good book that it might be worth learning C from it and then updating your knowledge as you read about the C99 standard.
If you are at student level you probably won't even notice the differences.
Yes, it's a bit odd that you can get a loud consensus that K&R is a great C book, and also a loud consensus that C99 is the correct/current/best version of C. The two positions are incompatible - even if K&R is the best book available to learn "C meaning C99", that just implies the rest are rubbish, or are also hopelessly outdated.
I would advise learning and using C99, but keeping an eye to C89 as you do so. If you use a compiler that has both C89 and C99 compliant modes, then you can write a few bits of C89 just to get an idea of the differences. Then if you ever need to write some code intended to be portable to places that C99 doesn't go, you'll know what to do. If you never have to write any such code, then you've wasted perhaps a day.
Writing C89 properly is actually surprisingly difficult, because getting hold of a copy of the C89 standard is difficult. So, C99 if you can, C89 if for some odd reason you have to, and have some awareness what the difference is. Maybe use K&R to cover the very basics, but get a look at some idiomatic C99 as soon as possible.
As for specific issues to be aware of when reading K&R: there's a list of major changes in the foreword of the standard (http://www.open-std.org/jtc1/sc22/wg14/www/docs/n1256.pdf), although the details aren't laid out there. A lot of them are new features added to C99, so it's not that K&R is wrong, it just may not always use the best tools for a given job. Some of them are quite fiddly things where you should probably consult the standard if you need the details anyway. The rest are things removed from C89, that usually a C99 compiler will tell you about as and when you try to use them.
As a student, that doesn't influence you so much. But if possible, you should find a newer C book which covers C99.
The term "C89" describes two very different languages:
The language that programmers in 1989 thought the Committee was describing in places where the Standard was ambiguous, and which supported features that were common in pre-existing implementations.
The language that the Committee has since decided that it wanted to have described, which threw compatibility with existing functionality out the window.
C99 "clarifies" ambiguous parts of the standard by saying that they meant to have the Standard interpreted in a way that would have broken a substantial fraction of existing code and made it impossible to perform many tasks as efficiently as they had been performed in C before 1989.
The right language to program in, for many applications, would be the superset of pre-Standard C, C89, C99, and C11. It's important, however, that anyone programming in that language be clear that they're using that language rather than a shrinking subset which favors speed over reliability.
While I think it's beneficial to know which features are more recent and less likely to be supported by obscure (or intentionally-broken, like MSVC) compilers, there are a few C99 features that you should absolutely use:
snprintf: This is the definitive function for safe and clean string assembly in C (see the sketch after this list). If your compiler is missing it, you can either replace the whole printf subsystem (probably a good idea, since most implementations missing snprintf are also full of (often intentional) bugs in printf behavior), or wrap tmpfile/fprintf/fread/fclose.
stdint.h: If you need fixed-size types (16/32/64-bit), use the standard names int16_t, uint16_t, int32_t, etc. Do not invent your own, and absolutely don't use system-specific ones like INT64 or u32. It just makes your code ugly and hard to integrate and reuse. If your compiler is missing stdint.h, just drop in your own to define the types in terms of the correct-for-your-platform types.
Specifically uint64_t, in place of int foo[2]; or struct { int lo, int hi; } foo; or other hideous legacy hacks to work with 64-bit numbers. Any sane compiler even without C99 support has its own 64-bit types you can use to define int64_t and uint64_t.
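Here is a minimal sketch of the snprintf pattern mentioned above, printing a fixed-width value (the buffer size and value are arbitrary):

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    char     buf[64];
    uint64_t big = 1234567890123ULL;

    /* snprintf never writes past sizeof buf and always NUL-terminates;
       the return value is the length the untruncated string would have had */
    int needed = snprintf(buf, sizeof buf, "value = %llu",
                          (unsigned long long)big);
    if (needed >= (int)sizeof buf)
        puts("output was truncated");
    else
        puts(buf);
    return 0;
}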

What has changed since “The C Programming Language”

My experience in C is mostly from the second edition of The C Programming Language, which is a very old book. What has changed in C since it was released, and what obsolete or deprecated functions should I avoid?
You can also look at the 'C' specifications that have come out since (like C99). These specs will indicate what they have added/removed/changed in relation to the previous standard.
http://en.wikipedia.org/wiki/C_%28programming_language%29
http://en.wikipedia.org/wiki/C99
http://en.wikipedia.org/wiki/C89_%28C_version%29
If you want to see what the future holds for 'C', have a look at C1X, which is the upcoming 'C' standard.
http://en.wikipedia.org/wiki/C1x
If you can grab a copy of the ISO C99 standard, the Foreword includes a nice 2-page list of major changes since C90.
Not very much has changed. For most practical purposes, the language described in K&R2 is still the one to use.
There has been a new C standard in 1999, but that has not been adopted as successfully and widely as the 1989 version of the standard (which K&R2 also describes).
The most important changes in C99 that could break existing programs are:
The implicit assumption of type int in declarations has been removed. Just make sure you always explicitly specify the types of your functions and variables.
Calling a function without a prior declaration is deprecated. Just make sure you declare all functions before use, preferably with a prototype.
Both of these were hold-overs from pre-standard days and have been considered bad practice for a long time.
The one function to avoid is (and has always been) gets().
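For reference, a minimal sketch of the usual bounded replacement for gets(), reading one line from stdin:

#include <stdio.h>
#include <string.h>

int main(void)
{
    char line[128];

    if (fgets(line, sizeof line, stdin) != NULL) {
        line[strcspn(line, "\n")] = '\0';   /* strip the trailing newline, if present */
        printf("read: %s\n", line);
    }
    return 0;
}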

Is C99 backward compatible with C89?

I'm used to old-style C and have just recently started to explore C99 features. I've just one question: will my program compile successfully if I use C99 in my program, pass the c99 flag to gcc, and link it with libraries that were built before C99?
So, should I stick to old C89 or evolve?
I believe that they are compatible in that respect. That is as long as the stuff that you are compiling against doesn't step on any of the new goodies. For instance, if the old code contains enum bool { false, true }; then you are in trouble. As a similar dinosaur, I am slowly embracing the wonderful new world of C99. After all, it has only been out there lurking for about 10 years now ;)
You should evolve. Thanks for listening :-)
Actually, I'll expand on that.
You're right that C99 has been around for quite a while. You should (in my opinion) be using that standard for anything other than legacy code (where you just fix bugs rather than add new features). It's probably not worth it for legacy code but you should (as with all business decisions) do your own cost/benefit analysis.
I'm already ensuring my new code is compatible with C1x - while I'm not using any of the new features yet, I try to make sure it won't break.
As to what code to look out for, the authors of the standards take backward compatibility very seriously. Their job was not ever to design a new language, it was to codify existing practices.
The phase they're in at the moment allows them some more latitude in updating the language but they still follow the Hippocratic oath in terms of their output: "first of all, do no harm".
Generally, if your code is broken with a new standard, the compiler is forced to tell you. So simply compiling your code base will be an excellent start. However, if you read the C99 rationale document, you'll see the phrase "quiet change" appear - this is what you need to watch out for.
These are behavioral changes in the compiler that you don't need to be informed about and may be the source of much angst and gnashing of teeth if your application starts acting strange. Don't worry about the "quiet change in c89" bits - if they were a problem, you would have already been bitten by them.
That document, by the way, is an excellent read to understand why the actual standard says what it says.
Some C89 features are not valid C99
Arguably, those features exist only for historical reasons, and should not be used in modern C89 code, but they do exist.
The C99 N1256 standard draft foreword paragraph 5 compares C99 to older revisions, and is a good place to start searching for those incompatibilities, even though it has by far more extensions than restrictions.
Implicit int return and variable types
Mentioned by Lutz in a comment, e.g. the following are valid C89:
static i;
f() { return 1; }
but not C99, in which you have to write:
static int i;
int f() { return 1; }
This also precludes calling functions without prototypes in C99: Are prototypes required for all functions in C89, C90 or C99?
n1256 says:
remove implicit int
Return without an expression from a non-void function
Valid C89, invalid C99:
int f() { return; }
I think in C89 it returns an implementation-defined value. n1256 says:
return without expression not permitted in function that returns a value
Integer division with negative operand
C89: rounds in an implementation-defined direction
C99: rounds toward 0
So if your compiler rounded toward -inf and you relied on that implementation-defined behavior, a C99 compiler is now forced to break your code.
https://stackoverflow.com/a/3604984/895245
n1256 says:
reliable integer division
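A small illustration of the difference (the comments show the C99-required results; a C89 compiler was allowed to round toward negative infinity instead):

#include <stdio.h>

int main(void)
{
    printf("-7 / 2 = %d\n", -7 / 2);    /* -3 in C99; C89 allowed -4 */
    printf("-7 %% 2 = %d\n", -7 % 2);   /* -1 in C99; C89 allowed  1 */
    return 0;
}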
Windows compatibility
One major practical concern is being able to compile on Windows, since Microsoft does not intend to implement C99 fully any time soon.
This is for example why libgit2 limits allowed C99 features.
Respectfully: Try it and find out. :-)
Though, keep in mind that even if you need to fix a few minor compiling differences, moving up is probably worth it.
If you don't rely on anything that C99 removed, C90 code will generally build fine with the c99 flag and link against libraries built before C99.
But some DOS-based libraries from the C89 era will certainly not work.
C99 is much more flexible, so feel free to migrate :-)
The calling conventions between C libraries haven't changed in ages, and in fact, I'm not sure they ever have.
Operating systems at this point rely heavily on the C calling conventions since the C APIs tend to be the glue between the pieces of the OS.
So, basically the answer is "Yes, the binaries will be backwards compatible. No, naturally, code using C99 features can't later be compiled with a non-C99 compiler."
It's intended to be backwards compatible. It formalizes extensions that many vendors have already implemented. It's possible, maybe even probable, that a well written program won't have any issues when compiling with C99.
In my experience, recompiling some modules and not others to save time... wastes a lot of time. Usually there is some easily overlooked detail that needs the new compiler to make it all compatible.
There are a few parts of the C89 Standard which are ambiguously written, and depending upon how one interprets the rule about types of pointers and the objects they're accessing, the Standard may be viewed as describing one of two very different languages--one of which is semantically much more powerful and consequently usable in a wider range of fields, and one of which allows more opportunities for compiler-based optimization. The C99 Standard "clarified" the rule to make clear that it makes no effort to mandate compatibility with the former language, even though it was overwhelmingly favored in many fields; it also treats as undefined some things that were defined in C89 but only because the C89 rules weren't written precisely enough to forbid them (e.g. the use of memcpy for type punning in cases where the destination has heap duration).
C99 may thus be compatible with the language that its authors thought was described by C89, but is not compatible with the language that was processed by most C89 compilers throughout the 1990s.
