C header file macro

ML_EXTERN _ML_AB ML_FUNC(M_Process)
#ifdef ML_ARGS_LIST
(
ML_INT i
);
#endif
If I am creating interfaces for vendors (who need to implement those interfaces), why would I have the above declaration in my API? I think it's called a macro - why wouldn't I just declare the function like "normal": int M_Process();? What is the difference, and what do the various parts of the declaration above do?

Macros like these are largely added for compatibility with different C compilers. You'll find this in code from the 1980s and 1990s, written for compilers that either were K&R compliant (predating ANSI C) or lacked various features.
You'd have to read the macro definitions to be absolutely sure, but there are some common patterns with this kind of thing. The macros break into a few types:
Linkage conventions
By defining ML_EXTERN, the API can support different calling conventions (e.g., stdcall, fastcall, etc.), or different kinds of external linkage (e.g., Microsoft's dllexport). This provides compiler compatibility and cross-platform compatibility while not substantially affecting code readability.
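You'd have to check the actual header, but a hypothetical set of definitions might look like the following (ML_BUILD_DLL is a made-up build flag used only for illustration):
/* Illustrative sketch only -- the real header will differ. */
#if defined(_WIN32) && defined(ML_BUILD_DLL)
#  define ML_EXTERN __declspec(dllexport)   /* building the Windows DLL */
#elif defined(_WIN32)
#  define ML_EXTERN __declspec(dllimport)   /* consuming the Windows DLL */
#else
#  define ML_EXTERN extern                  /* plain external linkage elsewhere */
#endif

#ifdef _WIN32
#  define _ML_AB __stdcall                  /* force a particular calling convention */
#else
#  define _ML_AB                            /* expands to nothing */
#endif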
Consistent type sizes
The number of bits in types like int, short, and long are platform-dependent, although there are some restrictions. Many libraries that predate the standard inttypes.h and stdint.h headers use macros or typedefs to ensure that their types will be the same size on all platforms. ML_INT is either a macro or a typedef to an appropriately-sized integer type.
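A library that predates <stdint.h> might do something along these lines (purely illustrative; the real header could use different conditions, or a macro instead of a typedef):
#include <limits.h>

/* Sketch only: pick an integer type that is at least 32 bits wide. */
#if INT_MAX >= 2147483647
typedef int  ML_INT;    /* int is wide enough on this platform */
#else
typedef long ML_INT;    /* e.g. 16-bit int targets: long is at least 32 bits */
#endif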
Prototype support
Many compilers in the 1980s were built to the K&R standard for C and did not support ANSI function prototypes. The ML_FUNC and ML_ARGS_LIST macros address this.
If prototypes are supported, then ML_ARGS_LIST will be defined and ML_FUNC(x) will expand to x. The full expansion, ignoring ML_EXTERN and _ML_AB, would be:
M_Process(ML_INT i)
If the compiler does not support prototypes, then ML_FUNC(x) will expand to x() and the prototype argument list will be omitted. The full expansion would be:
M_Process()
In K&R C, all arguments undergo the default argument promotions, and the parameter types are given where the function is defined, not where it is declared. See Wikipedia's entry on C for some details of the differences between C versions.
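The question doesn't show the macro definitions, but a sketch consistent with the behaviour described above, using the classic __STDC__ test, would be:
/* Sketch only -- the real header may test a different macro. */
#if defined(__STDC__) || defined(__cplusplus)
#  define ML_ARGS_LIST              /* defined: keep the (ML_INT i); prototype block */
#  define ML_FUNC(name) name
#else
#  define ML_FUNC(name) name();     /* K&R compiler: emit an old-style declaration */
#endif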

C2x: what is the rationale for alignas to be a keyword rather than a macro?

In 1990 P.J. Plauger wrote (emphasis added):
Standard C offers you an additional level of security, however. It is a level
offered in no other language standard that I know. It promises that if you avoid
certain sets of names, you will experience no collisions. Thus, Standard C makes
it that much easier to write highly portable applications.
In C11, the keyword _Alignas (for example) was introduced with an accompanying macro alignas defined in <stdalign.h>. Here we see that the keyword is _Alignas, not alignas (since in pre-C11 code alignas is not reserved). Hence, there is no collision with a possible user-defined alignas.
However, in C2x alignas is a keyword and <stdalign.h> provides no content (and C2x says nothing about an __alignas_is_defined macro -- defect?). It means that any pre-C2x code containing a user-defined alignas is no longer valid C2x, and hence backward compatibility is broken.
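A minimal illustration of the collision (the identifier name and initializer are just examples):
/* Valid C11/C17 as long as <stdalign.h> is not included:
 * alignas is an ordinary identifier here. */
int alignas = 16;

int main(void) { return alignas; }

/* In C2x/C23 the same code no longer compiles, because alignas is a keyword. */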
Questions:
Does it mean that since C2x the "you will experience no collisions" does not hold any longer?
What is the rationale for alignas (for example) to be a keyword rather than a macro?
The proposal for this change (available at https://open-std.org/JTC1/SC22/WG14/www/docs/n2934.pdf) argues that the naming strategy for new keywords has already been inconsistent across previous standard versions:
some were integrated using non-reserved names (const, inline) others were integrated in an underscore-capitalized form. For some of them, the use of the lower-case form then is ensured via a set of library header files.
Further, using the same keyword naming as C++ (for compatibility purposes) is also mentioned in the proposal, since some of these keywords originated in that language and were later added to C.

C compiler - list __builtin_ types

Here is a small (and working) piece of C code:
typedef __builtin_va_list __va_list;
int main() {
    return 0;
}
I found an answer describing how gcc finds the base type:
Pycparser not working on preprocessed code
But how can I list all of the __builtin_ "base" types, which are not defined explicitly?
Thanks,
a.
how can I list all of the __builtin_ "base" types, which are not defined explicitly?
TL;DR: there is no general-purpose way to do it.
The standard does not define any such types, so they can only be implementation-specific. In particular, C does not attribute any significance to the __builtin_ name prefix (though such identifiers are reserved), nor does it acknowledge that any types exist that are not derived from those it does define. Thus, for most purposes, the types you are asking about should be considered an implementation detail.
If there were a way to list implementation-specific built-in types, it would necessarily be implementation-specific itself. For example, you might be able to find a list such as you are after in the compiler's documentation. You could surely derive one from the compiler's own source code, if that's available to you. You could maybe extract strings from the compiler binary, and filter for a characteristic name pattern, such as strings starting with "__builtin_".
You could also consider parsing all the standard library headers (with the assumption that they are correct) to find undeclared types, though that's not guaranteed to find all the available types. Moreover, with some systems, for example GNU's, the C standard library (to which the headers belong) is separate from the compiler.

Is #define banned in industry standards?

I am a first year computer science student and my professor said #define is banned in the industry standards along with #if, #ifdef, #else, and a few other preprocessor directives. He used the word "banned" because of unexpected behaviour.
Is this accurate? If so why?
Are there, in fact, any standards which prohibit the use of these directives?
First I've heard of it.
No; #define and so on are widely used. Sometimes too widely used, but definitely used. There are places where the C standard mandates the use of macros — you can't avoid those easily. For example, §7.5 Errors <errno.h> says:
The macros are
EDOM
EILSEQ
ERANGE
which expand to integer constant expressions with type int, distinct positive values, and which are suitable for use in #if preprocessing directives; …
Given this, it is clear that not all industry standards prohibit the use of the C preprocessor macro directives. However, there are 'best practices' or 'coding guidelines' standards from various organizations that prescribe limits on the use of the C preprocessor, though none ban its use completely — it is an innate part of C and cannot be wholly avoided. Often, these standards are for people working in safety-critical areas.
One standard you could check is the MISRA C (2012) standard; that tends to proscribe things, but even that recognizes that #define et al are sometimes needed (section 8.20, rules 20.1 through 20.14 cover the C preprocessor).
The NASA GSFC (Goddard Space Flight Center) C Coding Standards simply say:
Macros should be used only when necessary. Overuse of macros can make code harder to read and maintain because the code no longer reads or behaves like standard C.
The discussion after that introductory statement illustrates the acceptable use of function macros.
The CERT C Coding Standard has a number of guidelines about the use of the preprocessor, and implies that you should minimize the use of the preprocessor, but does not ban its use.
Stroustrup would like to make the preprocessor irrelevant in C++, but that hasn't happened yet. As Peter notes, some C++ standards, such as the JSF AV C++ Coding Standards (Joint Strike Fighter, Air Vehicle) from circa 2005, dictate minimal use of the C preprocessor. Essentially, the JSF AV C++ rules restrict it to #include and the #ifndef XYZ_H / #define XYZ_H / … / #endif dance that prevents multiple inclusions of a single header. C++ has some options that are not available in C — notably, better support for typed constants that can then be used in places where C does not allow them to be used. See also static const vs #define vs enum for a discussion of the issues there.
It is a good idea to minimize the use of the preprocessor — it is often abused at least as much as it is used (see the Boost preprocessor 'library' for illustrations of how far you can go with the C preprocessor).
Summary
The preprocessor is an integral part of C, and #define, #if, etc cannot be wholly avoided. The statement by the professor in the question is not generally valid: "#define is banned in the industry standards along with #if, #ifdef, #else, and a few other macros" is an over-statement at best, but might be supportable with explicit reference to specific industry standards (but the standards in question do not include ISO/IEC 9899:2011 — the C standard).
Note that David Hammen has provided information about one specific C coding standard — the JPL C Coding Standard — that prohibits a lot of things that many people use in C, including limiting the use of the C preprocessor (and limiting the use of dynamic memory allocation, and prohibiting recursion — read it to see why, and decide whether those reasons are relevant to you).
No, use of macros is not banned.
In fact, use of #include guards in header files is one common technique that is often mandated and encouraged by accepted coding guidelines. Some folks claim that #pragma once is an alternative to that, but the problem is that #pragma once - by definition, since pragmas are a hook provided by the standard for compiler-specific extensions - is non-standard, even if it is supported by a number of compilers.
That said, there are a number of industry guidelines and encouraged practices that actively discourage all usage of macros other than #include guards because of the problems macros introduce (not respecting scope, etc). In C++ development, use of macros is frowned upon even more strongly than in C development.
Discouraging use of something is not the same as banning it, since it is still possible to legitimately use it - for example, by documenting a justification.
Some coding standards may discourage or even forbid the use of #define to create function-like macros that take arguments, like
#define SQR(x) ((x)*(x))
because a) such macros are not type-safe, and b) somebody will inevitably write SQR(x++), which evaluates its argument twice and is bad juju.
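To see why, it helps to write out the expansion (illustrative snippet, not from any particular coding standard):
#define SQR(x) ((x)*(x))

int i = 3;
int s = SQR(i++);   /* expands to ((i++)*(i++)): i is modified twice without an
                       intervening sequence point, which is undefined behaviour */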
Some standards may discourage or ban the use of #ifdefs for conditional compilation. For example, the following code uses conditional compilation to properly print out a size_t value. For C99 and later, you use the %zu conversion specifier; for C89 and earlier, you use %lu and cast the value to unsigned long:
#if __STDC_VERSION__ >= 199901L
# define SIZE_T_CAST
# define SIZE_T_FMT "%zu"
#else
# define SIZE_T_CAST (unsigned long)
# define SIZE_T_FMT "%lu"
#endif
...
printf( "sizeof foo = " SIZE_T_FMT "\n", SIZE_T_CAST sizeof foo );
Some standards may mandate that instead of doing this, you implement the module twice, once for C89 and earlier, once for C99 and later:
/* C89 version */
printf( "sizeof foo = %lu\n", (unsigned long) sizeof foo );
/* C99 version */
printf( "sizeof foo = %zu\n", sizeof foo );
and then let Make (or Ant, or whatever build tool you're using) deal with compiling and linking the correct version. For this example that would be ridiculous overkill, but I've seen code that was an untraceable rat's nest of #ifdefs that should have had that conditional code factored out into separate files.
However, I am not aware of any company or industry group that has banned the use of preprocessor statements outright.
Macros cannot be "banned". The statement is nonsense. Literally.
For example, section 7.5 Errors <errno.h> of the C Standard requires the use of macros:
1 The header <errno.h> defines several macros, all relating to the reporting of error conditions.
2 The macros are
EDOM
EILSEQ
ERANGE
which expand to integer constant expressions with type int, distinct positive values, and which are suitable for use in #if preprocessing directives; and
errno
which expands to a modifiable lvalue that has type int and thread local storage duration, the value of which is set to a positive error number by several library functions. If a macro definition is suppressed in order to access an actual object, or a program defines an identifier with the name errno, the behavior is undefined.
So, not only are macros a required part of C, in some cases not using them results in undefined behavior.
No, #define is not banned. Misuse of #define, however, may be frowned upon.
For instance, you may use
#define DEBUG
in your code so that later on, you can designate parts of your code for conditional compilation using #ifdef DEBUG, for debug purposes only. I don't think anyone in his right mind would want to ban something like this. Macros defined using #define are also used extensively in portable programs, to enable/disable compilation of platform-specific code.
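For example, a minimal sketch of that pattern:
#include <stdio.h>

#define DEBUG              /* comment this out for a release build */

int main(void) {
#ifdef DEBUG
    fprintf(stderr, "debug build: extra diagnostics enabled\n");
#endif
    return 0;
}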
However, if you are using something like
#define PI 3.141592653589793
your teacher may rightfully point out that it is much better to declare PI as a constant with the appropriate type, e.g.,
const double PI = 3.141592653589793;
as it allows the compiler to do type checking when PI is used.
Similarly (as mentioned by John Bode above), the use of function-like macros may be disapproved of, especially in C++ where templates can be used. So instead of
#define SQ(X) ((X)*(X))
consider using
double SQ(double X) { return X * X; }
or, in C++, better yet,
template <typename T> T SQ(T X) { return X * X; }
Once again, the idea is that by using the facilities of the language instead of the preprocessor, you allow the compiler to type check and also (possibly) generate better code.
Once you have enough coding experience, you'll know exactly when it is appropriate to use #define. Until then, I think it is a good idea for your teacher to impose certain rules and coding standards, but preferably they themselves should know, and be able to explain, the reasons. A blanket ban on #define is nonsensical.
That's completely false; macros are heavily used in C. Beginners often use them badly, but that's not a reason to ban them from industry. A classic bad usage is #define successor(n) n + 1. If you expect 2 * successor(9) to give 20, then you're wrong, because that expression will be translated as 2 * 9 + 1, i.e. 19, not 20. Use parentheses to get the expected result.
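A fully parenthesised version of that macro (just a sketch) avoids the surprise:
#define successor(n) ((n) + 1)

/* 2 * successor(9) now expands to 2 * ((9) + 1), which evaluates to 20. */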
No. It is not banned. And truth be told, it is impossible to write non-trivial multi-platform code without it.
No, your professor is wrong, or you misheard something.
#define is a preprocessor directive, and preprocessor macros are needed for conditional compilation and for some conventions that aren't simply built into the C language. For example, in a recent C standard, namely C99, support for booleans was added. But it's not supported natively by the language; it is provided via preprocessor #defines. See this reference to stdbool.h.
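Roughly speaking, C99's <stdbool.h> is little more than a few macros on top of the built-in _Bool type; a simplified view of what the header provides:
/* Simplified sketch of what C99 <stdbool.h> defines. */
#define bool  _Bool
#define true  1
#define false 0
#define __bool_true_false_are_defined 1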
Macros are used pretty heavily in GNU-land C, and without conditional preprocessor commands there'd be no way to properly handle multiple inclusions of the same source files, so that makes them seem like essential language features to me.
Maybe your class is actually on C++, which, despite many people's failure to do so, should be distinguished from C as it is a different language, and I can't speak for macros there. Or maybe the professor meant he's banning them in his class. Anyhow, I'm sure the SO community would be interested in hearing which standard he's talking about, since I'm pretty sure all C standards support the use of macros.
Contrary to all of the answers to date, the use of preprocessor directives is oftentimes banned in high-reliability computing. There are two exceptions to this, the use of which are mandated in such organizations. These are the #include directive, and the use of an include guard in a header file. These kinds of bans are more likely in C++ rather than in C.
Here's but one example: 16.1.1 Use the preprocessor only for implementing include guards, and including header files with include guards.
Another example, this time for C rather than C++: the JPL Institutional Coding Standard for the C Programming Language. This C coding standard doesn't go quite so far as banning the use of the preprocessor completely, but it comes close. Specifically, it says:
Rule 20 (preprocessor use)
Use of the C preprocessor shall be limited to file inclusion and simple macros. [Power of Ten Rule 8].
I'm neither condoning nor decrying those standards. But to say they don't exist is ludicrous.
If you want your C code to interoperate with C++ code, you will want to declare your externally visible symbols, such as function declarations, in the extern "C" namespace. This is often done using conditional compilation:
#ifdef __cplusplus
extern "C" {
#endif
/* C header file body */
#ifdef __cplusplus
}
#endif
Look at any header file and you will see something like this:
#ifndef _FILE_NAME_H
#define _FILE_NAME_H
//Exported functions, structs, defines, etc. go here
#endif /*_FILE_NAME_H */
These #defines are not only allowed but critical, because every file that references the header will include it again. Without the guard, everything between the guard would be redefined each time, which in the best case fails to compile and in the worst case leaves you scratching your head later wondering why your code doesn't behave the way you want it to.
The compiler also provides predefined macros, as seen here with gcc, that let you test for things like the compiler version, which is very useful. I'm currently working on a project that needs to compile with avr-gcc, but we have a testing environment that we also run our code through. To keep the AVR-specific files and registers from breaking our test code, we do something like this:
#ifdef __AVR__
//avr specific code here
#endif
With this in the production code, the complementary test code can compile without avr-gcc, and the code above is only compiled when avr-gcc is used.
If you had just mentioned #define, I would have thought maybe he was alluding to its use for enumerated constants, which are better expressed with enum to avoid stupid errors such as assigning the same numerical value twice.
Note that even for this situation, it is sometimes better to use #defines than enums, for instance if you rely on numerical values exchanged with other systems and the actual values must stay the same even if you add/delete constants (for compatibility).
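As a small illustration (the constant names are made up):
/* With #define it is easy to duplicate a value by accident: */
#define STATE_IDLE    0
#define STATE_RUNNING 1
#define STATE_STOPPED 1        /* oops: same value as STATE_RUNNING, compiles silently */

/* An enum numbers its constants automatically (0, 1, 2): */
enum state { IDLE, RUNNING, STOPPED };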
However, adding that #if, #ifdef, etc. should not be used either is just weird. Of course, they should probably not be abused, but in real life there are dozens of reasons to use them.
What he may have meant is that (where appropriate) you should not hard-code behaviour in the source (which would require re-compilation to get a different behaviour), but rather use some form of run-time configuration instead.
That's the only interpretation I could think of that would make sense.

Forward declare entities in C standard library?

Is it legal to forward declare structs and functions provided by the C standard library?
My background is C++ in which the answer is no. The primary reason for this is that a struct or class mandated by the C++ standard library can be a template behind the scenes and may have "secret" template parameters and so cannot be properly declared with a naive non-template declaration. Even if a user does figure out exactly how to forward declare a particular entity in a particular version of a particular implementation, the implementation is not obliged to not break that declaration in future versions.
I don't have a copy of any C standard at hand but obviously there are no templates in C.
So is it legal to forward declare entities in the C standard library?
Another reason that entities in the C++ standard library may not be forward declared is that headers provided by the implementation need not follow the normal rules. For example, in a recent question I asked if a C++ header provided by the implementation need be an actual file and the answer was no. I don't know if any of that applies to C.
The C standard library is used by both C and C++ but for this question I'm only asking about C.
Forward declarations of structs are always permissible in C. However, not very many types can be used this way. For example, you can't use a forward declaration for FILE simply because the tag name of the struct is not specified (and theoretically, it may not be a struct at all).
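For example, struct tm has a tag that the standard does specify, so it can be forward declared and used as an incomplete type, whereas FILE cannot (a small sketch; the function is hypothetical):
/* Legal: the standard names this type "struct tm", so the tag is known. */
struct tm;
void print_time(const struct tm *t);   /* hypothetical function using the incomplete type */

/* Not possible for FILE: it is only exposed as a typedef name, and its
 * underlying tag (if it even is a struct) is unspecified. */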
Section 7.1.4 paragraph 2 of n1570 gives you permission to do the same with functions:
Provided that a library function can be declared without reference to any type defined in a
header, it is also permissible to declare the function and use it without including its
associated header.
This used to be rather common. I think the reasoning here is that hard drives are slow, and fewer #include directives mean faster compile times. But this isn't the 1980s any more, and we all have fast CPUs and fast hard drives, so a few #includes aren't even noticed.
void *malloc(size_t);   /* note: size_t is defined in a header (<stddef.h>), so strictly this
                           one falls outside the quoted permission unless that header is included */
void abort(void);       /* fine: declarable without reference to any header-defined type */

/* my code here */
Yes, you can; this is perfectly valid.
This can be done with the standard library too.
/* Forward declaration instead of including <stdlib.h>, which normally declares atof. */
double atof(const char *);

int main(void) {
    double t = atof("13.37");
    (void)t;    /* silence unused-variable warnings */
    return 0;
}
Similar things can be done with structs, variables, etc.
I would recommend you read the Wikipedia page, which features some C examples:
http://en.wikipedia.org/wiki/Forward_declaration
This is specified in the C standard, section 7.1.4 paragraph 2 of n1570:
Provided that a library function can be declared without reference to any type defined in a header, it is also permissible to declare the function and use it without including its associated header.

C11 type-generic expressions - why not just add function overloading?

I was just reading the Wikipedia article on C11, the new version of the C standard released in Dec 2011, and I saw that one of the added features was "type-generic expressions":
Type-generic expressions using the _Generic keyword. For example, the
following macro cbrt(x) translates to cbrtl(x), cbrt(x) or cbrtf(x)
depending on the type of x:
#define cbrt(X) _Generic((X), long double: cbrtl, \
                              default: cbrt,      \
                              float: cbrtf)(X)
This looks pretty horrible to me - if they are going to change the language anyways, why not just add function overloading like in C++?
C has one namespace for external symbols, and applies the ODR (One Definition Rule) such that two extern objects with the same name in two translation units must have the same definition.
Although it's possible to create a C ABI that supports overloading, the main strength of C is its ABI simplicity. On almost all platforms "the" ABI is the C ABI, and it plays some role in execution no matter the source language. This would be lost if symbols had to include type information.
TGE (type-generic expressions, as used by the library) is just a manually-operated version of name mangling. It does (or will do, sometime in the possibly very distant future) the job it needs to do: allowing typedef declarations to control the generation of math-intensive inner loops. People who need the features of a language like C++ should port to C++.
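For what it's worth, here is what that "manually-operated name mangling" looks like from the user's side; a small sketch of a type-generic wrapper (my_abs is a made-up name) over existing standard functions:
#include <math.h>
#include <stdlib.h>

/* Pick the right standard function based on the argument's type. */
#define my_abs(x) _Generic((x),           \
                           int:     abs,  \
                           float:   fabsf,\
                           default: fabs)(x)

/* my_abs(-3) calls abs(), my_abs(-3.5f) calls fabsf(), my_abs(-3.5) calls fabs(). */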
