The macro: ISO_C_VISIBLE

What is the purpose of this macro: ISO_C_VISIBLE? I found it in the assert.h file.
Is it used to know which C version we are using?

Here's what an OpenBSD cdefs.h has to say about __ISO_C_VISIBLE and some related macros:
/*
* "The nice thing about standards is that there are so many to choose from."
* There are a number of "feature test macros" specified by (different)
* standards that determine which interfaces and types the header files
* should expose.
*
* Because of inconsistencies in these macros, we define our own
* set in the private name space that end in _VISIBLE. These are
* always defined and so headers can test their values easily.
* Things can get tricky when multiple feature macros are defined.
* We try to take the union of all the features requested.
*
* The following macros are guaranteed to have a value after cdefs.h
* has been included:
* __POSIX_VISIBLE
* __XPG_VISIBLE
* __ISO_C_VISIBLE
* __BSD_VISIBLE
*/
That particular cdefs.h sets __ISO_C_VISIBLE according to whatever POSIX specification level is configured.
So these are macros that BSD uses to 'condense' the various feature test macros used across the Unix world to configure a build environment into a more manageable set that other headers can rely on.
For example, setting the __ISO_C_VISIBLE macro appropriately (which the user does indirectly, by setting other documented feature macros) allows older programs whose names conflict with C99 names to keep compiling cleanly: if the build is properly configured, those conflicting C99 names are not 'activated', so the use of those names in the user's program does not conflict.
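For instance (lround is used here purely as an illustrative C99 name), an older program that defines its own lround() keeps compiling when the headers are configured to hide C99 declarations. A minimal sketch:

/* A hypothetical old program with its own lround(); it builds cleanly when
 * <math.h> hides C99 names (i.e. __ISO_C_VISIBLE < 1999). If the header
 * exposed C99's long lround(double), this int-returning version would
 * trigger a "conflicting types" error instead. */
#include <math.h>

int lround(double x)
{
    return (int)(x + 0.5);
}

int main(void)
{
    return lround(2.4);
}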
If you look in the assert.h file for that OpenBSD source drop, you'll see:
# if __ISO_C_VISIBLE >= 1999
# define assert(e) ((e) ? (void)0 : __assert2(__FILE__, __LINE__, __func__, #e))
# else
# define assert(e) ((e) ? (void)0 : __assert(__FILE__, __LINE__, #e))
# endif
So, if the build is configured to use C99 features the assert macro will take advantage of C99's __func__ feature so an assertion will indicate which function the assertion was in. If the build is configured to indicate that C99 features should not be used, assert() won't do that.
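To see the difference, consider a minimal sketch (the file name appearing in the sample diagnostic is illustrative):

#include <assert.h>

static int check_positive(int n)
{
    assert(n > 0);   /* a C99-visible build reports check_positive via __func__ */
    return n;
}

int main(void)
{
    return check_positive(-1);   /* deliberately fails the assertion */
}

With __ISO_C_VISIBLE >= 1999 the diagnostic reads something like "assertion "n > 0" failed: file demo.c, line 5, function check_positive"; the C90 form omits the function name.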
Note that these macros are not a general standard - they seem to be mostly in the BSD world, but I'm sure you'll find other areas where they might be used (probably because files got borrowed from BSD).

A Google search for "ISO_C_VISIBLE" turns up this question and a handful of results for Nokia's Symbian operating system. The only description says:
__ISO_C_VISIBLE 1999
Description
Macro value to enable for ISO_C_VISIBLE
My guess is that the value 1999 refers to the 1999 ISO C standard, but I see no further explanation of what it means or how it's used, or of the distinction between __ISO_C_VISIBLE and ISO_C_VISIBLE. It seems odd because Symbian is primarily implemented and programmed in C++, not C. And I certainly wouldn't expect it to be defined in <assert.h> (assuming that when you say assert.h you're referring to the header that's included by #include <assert.h>).
If you're not working with Symbian, then I have no idea what it might be.
The proper way to determine which C standard your implementation conforms to is to use the predefined __STDC__ and __STDC_VERSION__ macros. For a conforming C90 or later implementation, __STDC__ expands to 1. For a C99 implementation, __STDC_VERSION__ expands to 199901L. For C11, it probably expands to 201112L, but I haven't seen an actual copy of the new standard.

These macros are used to configure the code for the system compiling it. Information gathered from the compiler environment is stored in them so that the code is shaped according to the constructs supported by the hosting platform.
/* ... */
#ifdef _POSIX_C_SOURCE
#if _POSIX_C_SOURCE >= 200112
#define __POSIX_VISIBLE 200112
#define __ISO_C_VISIBLE 1999
#elif _POSIX_C_SOURCE >= 199506
#define __POSIX_VISIBLE 199506
/* ... */
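Library headers can then test the condensed macros directly; here is a minimal sketch of the pattern (the header and function names are hypothetical):

/* somelib.h -- hypothetical header relying on the condensed macros */
#if __ISO_C_VISIBLE >= 1999
long long somelib_count(void);   /* long long exists only in C99-visible builds */
#endif
#if __POSIX_VISIBLE >= 200112
int somelib_fd(void);            /* exposed only for POSIX.1-2001 and later */
#endif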

Related

implicit declaration error even after including required header

I get the following error when I run my code:
error: implicit declaration of function ‘mkdtemp’ [-Werror=implicit-function-declaration]
This occurs even after including the correct header files for mkdtemp():
#include <stdlib.h>
Any ideas why this might be occurring?
The <stdlib.h> header is mandated by the C standard. The C standard makes no reference to a mkdtemp() function. If you're using gcc -std=c11 or some similar option, only the definitions provided by the C standard are exposed. If you compile using gcc -std=gnu11, then you'll get an indeterminate set of extension features enabled (and mkdtemp() would be one of them).
Since mkdtemp() is a POSIX function, you can explicitly request it by defining the appropriate enabling macro before including any standard header. A command-line option -D_XOPEN_SOURCE=700 would (probably) do the job, for example; there's also the option of using -D_POSIX_C_SOURCE=200809 but remembering the correct number is harder (it is the date of the POSIX 2008 standard as year and month).
Or you can place the appropriate #define at the top of the file:
#ifndef _XOPEN_SOURCE
#define _XOPEN_SOURCE 700
#endif
or:
#ifndef _POSIX_C_SOURCE
#define _POSIX_C_SOURCE 200809L
#endif
These stanzas allow you to override the POSIX version on the command line. Simply writing the #define without the conditional around it would generate a warning (or error) for a non-benign redefinition of the macro.
There used to be major differences between the POSIX and X/Open functionality — X/Open included some things that POSIX doesn't. This distinction is smaller these days, and generally, you'll not get into trouble using the X/Open macro.
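Putting the pieces together, here is a minimal sketch of a program that requests POSIX 2008 and calls mkdtemp() (the directory template is illustrative):

#ifndef _POSIX_C_SOURCE
#define _POSIX_C_SOURCE 200809L   /* must precede every standard header */
#endif

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char tmpl[] = "/tmp/demo.XXXXXX";   /* the trailing XXXXXX is required */
    char *dir = mkdtemp(tmpl);          /* declared in <stdlib.h> thanks to the macro */
    if (dir == NULL)
    {
        perror("mkdtemp");
        return 1;
    }
    printf("created %s\n", dir);
    return 0;
}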
There are other enabling macros for other platforms, but one of these two will enable the declaration of mkdtemp(). On Linux (RHEL 7.x), /usr/include/features.h documents these enabling macros:
/* These are defined by the user (or the compiler)
to specify the desired environment:
__STRICT_ANSI__ ISO Standard C.
_ISOC99_SOURCE Extensions to ISO C89 from ISO C99.
_ISOC11_SOURCE Extensions to ISO C99 from ISO C11.
_POSIX_SOURCE IEEE Std 1003.1.
_POSIX_C_SOURCE If ==1, like _POSIX_SOURCE; if >=2 add IEEE Std 1003.2;
if >=199309L, add IEEE Std 1003.1b-1993;
if >=199506L, add IEEE Std 1003.1c-1995;
if >=200112L, all of IEEE 1003.1-2004
if >=200809L, all of IEEE 1003.1-2008
_XOPEN_SOURCE Includes POSIX and XPG things. Set to 500 if
Single Unix conformance is wanted, to 600 for the
sixth revision, to 700 for the seventh revision.
_XOPEN_SOURCE_EXTENDED XPG things and X/Open Unix extensions.
_LARGEFILE_SOURCE Some more functions for correct standard I/O.
_LARGEFILE64_SOURCE Additional functionality from LFS for large files.
_FILE_OFFSET_BITS=N Select default filesystem interface.
_BSD_SOURCE ISO C, POSIX, and 4.3BSD things.
_SVID_SOURCE ISO C, POSIX, and SVID things.
_ATFILE_SOURCE Additional *at interfaces.
_GNU_SOURCE All of the above, plus GNU extensions.
_REENTRANT Select additionally reentrant object.
_THREAD_SAFE Same as _REENTRANT, often used by other systems.
_FORTIFY_SOURCE If set to numeric value > 0 additional security
measures are defined, according to level.
*/
Note, too, that the manual page for mkdtemp() shows what is needed:
NAME
mkdtemp - create a unique temporary directory
SYNOPSIS
#include <stdlib.h>
char *mkdtemp(char *template);
Feature Test Macro Requirements for glibc (see feature_test_macros(7)):
mkdtemp():
_BSD_SOURCE
|| /* Since glibc 2.10: */
(_POSIX_C_SOURCE >= 200809L || _XOPEN_SOURCE >= 700)
What I called 'enabling macros' are also known as 'Feature Test' macros.
See also POSIX System Interfaces: General Information: The Compilation Environment.

MISRA C 2012 - Rule 21.1 - Macros starting with underscore

Rule 21.1 in MISRA C 2012 states that
#define and #undef shall not be used on a reserved identifier or reserved macro name
This rule applies to identifiers or macros beginning with an underscore.
Rationale:
Removing or changing the meaning of a reserved macro may result in
undefined behaviour
I don't understand why a macro's name shall not start with an underscore, even if it is not a reserved macro. For example, in my header files:
#ifndef __MY_HEADER_
#define __MY_HEADER_
or in a library I'm using:
#define __I volatile const
Should I change all my code and the library I'm using (which is a big library) in order to conform to this rule or is there a simpler solution?
According to the C standard (section 7.1.3), all identifiers starting with an underscore followed by an uppercase letter (_[A-Z]) or by another underscore (__) are always reserved; identifiers with a single leading underscore are reserved at file scope. As they are reserved, common sense and rule 21.1 forbid you to modify (redefine or undefine) them, or to create your own.
Thus, you should change your code to avoid leading underscores, in include guards as well as in your other macros.
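For example, a compliant include guard simply drops the leading underscores (the name below is illustrative):

#ifndef MY_HEADER_H
#define MY_HEADER_H
/* ... header contents ... */
#endif /* MY_HEADER_H */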
Some further reading can be found e.g. here: Include guard conventions in C

Is #define banned in industry standards?

I am a first year computer science student and my professor said #define is banned in the industry standards along with #if, #ifdef, #else, and a few other preprocessor directives. He used the word "banned" because of unexpected behaviour.
Is this accurate? If so why?
Are there, in fact, any standards which prohibit the use of these directives?
First I've heard of it.
No; #define and so on are widely used. Sometimes too widely used, but definitely used. There are places where the C standard mandates the use of macros — you can't avoid those easily. For example, §7.5 Errors <errno.h> says:
The macros are
EDOM
EILSEQ
ERANGE
which expand to integer constant expressions with type int, distinct positive values, and which are suitable for use in #if preprocessing directives; …
Given this, it is clear that not all industry standards prohibit the use of the C preprocessor macro directives. However, there are 'best practices' or 'coding guidelines' standards from various organizations that prescribe limits on the use of the C preprocessor, though none ban its use completely — it is an innate part of C and cannot be wholly avoided. Often, these standards are for people working in safety-critical areas.
One standard you could check is the MISRA C (2012) standard; that tends to proscribe things, but even it recognizes that #define et al are sometimes needed (section 8.20, rules 20.1 through 20.14, covers the C preprocessor).
The NASA GSFC (Goddard Space Flight Center) C Coding Standards simply say:
Macros should be used only when necessary. Overuse of macros can make code harder to read and maintain because the code no longer reads or behaves like standard C.
The discussion after that introductory statement illustrates the acceptable use of function macros.
The CERT C Coding Standard has a number of guidelines about the use of the preprocessor, and implies that you should minimize the use of the preprocessor, but does not ban its use.
Stroustrup would like to make the preprocessor irrelevant in C++, but that hasn't happened yet. As Peter notes, some C++ standards, such as the JSF AV C++ Coding Standards (Joint Strike Fighter, Air Vehicle) from circa 2005, dictate minimal use of the C preprocessor. Essentially, the JSF AV C++ rules restrict it to #include and the #ifndef XYZ_H / #define XYZ_H / … / #endif dance that prevents multiple inclusions of a single header. C++ has some options that are not available in C — notably, better support for typed constants that can then be used in places where C does not allow them to be used. See also static const vs #define vs enum for a discussion of the issues there.
It is a good idea to minimize the use of the preprocessor — it is often abused at least as much as it is used (see the Boost preprocessor 'library' for illustrations of how far you can go with the C preprocessor).
Summary
The preprocessor is an integral part of C and #define and #if etc cannot be wholly avoided. The statement by the professor in the question is not generally valid: "#define is banned in the industry standards along with #if, #ifdef, #else, and a few other macros" is an over-statement at best, but might be supportable with explicit reference to specific industry standards (but the standards in question do not include ISO/IEC 9899:2011 — the C standard).
Note that David Hammen has provided information about one specific C coding standard — the JPL C Coding Standard — that prohibits a lot of things that many people use in C, including limiting the use of the C preprocessor (and limiting the use of dynamic memory allocation, and prohibiting recursion — read it to see why, and decide whether those reasons are relevant to you).
No, use of macros is not banned.
In fact, use of #include guards in header files is one common technique that is often mandatory and encouraged by accepted coding guidelines. Some folks claim that #pragma once is an alternative to that, but the problem is that #pragma once - by definition, since pragmas are a hook provided by the standard for compiler-specific extensions - is non-standard, even if it is supported by a number of compilers.
That said, there are a number of industry guidelines and encouraged practices that actively discourage all usage of macros other than #include guards because of the problems macros introduce (not respecting scope, etc). In C++ development, use of macros is frowned upon even more strongly than in C development.
Discouraging use of something is not the same as banning it, since it is still possible to legitimately use it - for example, by documenting a justification.
Some coding standards may discourage or even forbid the use of #define to create function-like macros that take arguments, like
#define SQR(x) ((x)*(x))
because a) such macros are not type-safe, and b) somebody will inevitably write SQR(x++), which is bad juju.
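To make the hazard concrete, here is a minimal sketch; the dangerous expansion is shown in the comment rather than executed:

#include <stdio.h>

#define SQR(x) ((x)*(x))

int main(void)
{
    int n = 5;
    int s = SQR(n + 1);   /* fine: expands to ((n + 1)*(n + 1)) */
    /* int bad = SQR(n++); would expand to ((n++)*(n++)):
       n is modified twice without a sequence point -- undefined behavior */
    printf("%d\n", s);
    return 0;
}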
Some standards may discourage or ban the use of #ifdefs for conditional compilation. For example, the following code uses conditional compilation to properly print out a size_t value. For C99 and later, you use the %zu conversion specifier; for C89 and earlier, you use %lu and cast the value to unsigned long:
#if __STDC_VERSION__ >= 199901L
# define SIZE_T_CAST
# define SIZE_T_FMT "%zu"
#else
# define SIZE_T_CAST (unsigned long)
# define SIZE_T_FMT "%lu"
#endif
...
printf( "sizeof foo = " SIZE_T_FMT "\n", SIZE_T_CAST sizeof foo );
Some standards may mandate that instead of doing this, you implement the module twice, once for C89 and earlier, once for C99 and later:
/* C89 version */
printf( "sizeof foo = %lu\n", (unsigned long) sizeof foo );
/* C99 version */
printf( "sizeof foo = %zu\n", sizeof foo );
and then let Make (or Ant, or whatever build tool you're using) deal with compiling and linking the correct version. For this example that would be ridiculous overkill, but I've seen code that was an untraceable rat's nest of #ifdefs that should have had that conditional code factored out into separate files.
However, I am not aware of any company or industry group that has banned the use of preprocessor statements outright.
Macros can not be "banned". The statement is nonsense. Literally.
For example, section 7.5 Errors <errno.h> of the C Standard requires the use of macros:
1 The header <errno.h> defines several macros, all relating to the reporting of error conditions.
2 The macros are
EDOM
EILSEQ
ERANGE
which expand to integer constant expressions with type int, distinct
positive values, and which are suitable for use in #if preprocessing
directives; and
errno
which expands to a modifiable lvalue that has type int and thread
local storage duration, the value of which is set to a positive error
number by several library functions. If a macro definition is
suppressed in order to access an actual object, or a program defines
an identifier with the name errno, the behavior is undefined.
So, not only are macros a required part of C, in some cases not using them results in undefined behavior.
No, #define is not banned. Misuse of #define, however, may be frowned upon.
For instance, you may use
#define DEBUG
in your code so that later on, you can designate parts of your code for conditional compilation using #ifdef DEBUG, for debug purposes only. I don't think anyone in his right mind would want to ban something like this. Macros defined using #define are also used extensively in portable programs, to enable/disable compilation of platform-specific code.
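For illustration, a minimal sketch of that pattern (the messages are arbitrary):

#include <stdio.h>

#define DEBUG   /* remove this line (or control it with -DDEBUG) for release builds */

int main(void)
{
#ifdef DEBUG
    fprintf(stderr, "debug: entering main\n");   /* compiled only when DEBUG is defined */
#endif
    puts("hello");
    return 0;
}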
However, if you are using something like
#define PI 3.141592653589793
your teacher may rightfully point out that it is much better to declare PI as a constant with the appropriate type, e.g.,
const double PI = 3.141592653589793;
as it allows the compiler to do type checking when PI is used.
Similarly (as mentioned by John Bode above), the use of function-like macros may be disapproved of, especially in C++ where templates can be used. So instead of
#define SQ(X) ((X)*(X))
consider using
double SQ(double X) { return X * X; }
or, in C++, better yet,
template <typename T> T SQ(T X) { return X * X; }
Once again, the idea is that by using the facilities of the language instead of the preprocessor, you allow the compiler to type check and also (possibly) generate better code.
Once you have enough coding experience, you'll know exactly when it is appropriate to use #define. Until then, I think it is a good idea for your teacher to impose certain rules and coding standards, but preferably they themselves should know, and be able to explain, the reasons. A blanket ban on #define is nonsensical.
That's completely false; macros are heavily used in C. Beginners often use them badly, but that's not a reason to ban them from industry. A classic bad usage is #define successor(n) n + 1. If you expect 2 * successor(9) to give 20, then you're wrong, because that expression expands to 2 * 9 + 1, i.e. 19, not 20. Use parentheses to get the expected result.
No. It is not banned. And truth be told, it would be impossible to write non-trivial multi-platform code without it.
No, your professor is wrong, or you misheard something.
#define is a preprocessor directive, and preprocessor macros are needed for conditional compilation and for some conventions that aren't simply built into the C language. For example, C99 added the native boolean type _Bool, but the convenient names bool, true and false are supplied as preprocessor #defines. See this reference to stdbool.h
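A minimal illustration:

#include <stdbool.h>   /* bool, true and false are macros over the native _Bool */
#include <stdio.h>

int main(void)
{
    bool ready = true;   /* expands to: _Bool ready = 1; */
    if (ready)
        puts("ready");
    return 0;
}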
Macros are used pretty heavily in GNU land C, and without conditional preprocessor commands there'd be no way to properly handle multiple inclusions of the same header files, so that makes them seem like essential language features to me.
Maybe your class is actually on C++, which despite many people's failure to do so, should be distinguished from C as it is a different language, and I can't speak for macros there. Or maybe the professor meant he's banning them in his class. Anyhow I'm sure the SO community would be interested in hearing which standard he's talking about, since I'm pretty sure all C standards support the use of macros.
Contrary to all of the answers to date, the use of preprocessor directives is oftentimes banned in high-reliability computing. There are two exceptions to this, the use of which are mandated in such organizations. These are the #include directive, and the use of an include guard in a header file. These kinds of bans are more likely in C++ rather than in C.
Here's but one example: 16.1.1 Use the preprocessor only for implementing include guards, and including header files with include guards.
Another example, this time for C rather than C++: the JPL Institutional Coding Standard for the C Programming Language. This C coding standard doesn't go quite so far as banning the use of the preprocessor completely, but it comes close. Specifically, it says
Rule 20 (preprocessor use)
Use of the C preprocessor shall be limited to file inclusion and simple macros. [Power of Ten Rule 8].
I'm neither condoning nor decrying those standards. But to say they don't exist is ludicrous.
If you want your C code to interoperate with C++ code, you will want to give your externally visible symbols, such as function declarations, C linkage with extern "C". This is often done using conditional compilation:
#ifdef __cplusplus
extern "C" {
#endif
/* C header file body */
#ifdef __cplusplus
}
#endif
Look at any header file and you will see something like this:
#ifndef _FILE_NAME_H
#define _FILE_NAME_H
//Exported functions, structs, defines, etc. go here
#endif /*_FILE_NAME_H */
These defines are not only allowed but critical, because the header will be pulled in each time another file references it. Without the guard you would be redefining everything between #ifndef and #endif multiple times, which in the best case fails to compile and in the worst case leaves you scratching your head later, wondering why your code doesn't work the way you want it to.
Compilers also predefine macros, as seen here with gcc, that let you test for things like the version of the compiler, which is very useful. I'm currently working on a project that needs to compile with avr-gcc, but we also have a testing environment that we run our code through. To keep the AVR-specific files and registers from breaking our test builds, we do something like this:
#ifdef __AVR__
//avr specific code here
#endif
With this in the production code, the complementary test code can be compiled without avr-gcc, and the code above is compiled only when avr-gcc is used.
If you had just mentioned #define, I would have thought maybe he was alluding to its use for enumerations, which are better off using enum to avoid stupid errors such as assigning the same numerical value twice.
Note that even for this situation, it is sometimes better to use #defines than enums, for instance if you rely on numerical values exchanged with other systems and the actual values must stay the same even if you add/delete constants (for compatibility).
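A small sketch of that trade-off (the command names are illustrative):

/* Implicitly numbered enum: deleting CMD_B silently renumbers CMD_C from 2 to 1 */
enum command { CMD_A, CMD_B, CMD_C };

/* #defines pin each value independently, which matters when the numbers
   are exchanged with other systems and must never shift */
#define MSG_RESET 0
#define MSG_START 1
#define MSG_STOP  2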
However, adding that #if, #ifdef, etc. should not be used either is just weird. Of course, they should probably not be abused, but in real life there are dozens of reasons to use them.
What he may have meant could be that (where appropriate), you should not hardcode behaviour in the source (which would require re-compilation to get a different behaviour), but rather use some form of run-time configuration instead.
That's the only interpretation I could think of that would make sense.

How to #ifdef by CompilerType ? GCC or VC++

I used #ifdef Win32 for safe calls like sprintf_s, but now I want to build the project with MinGW and that test is just wrong now. I need to use #ifdef VC++ or something like that. Is it possible?
#ifdef __clang__
/* code specific to the clang compiler */
#elif defined(__MINGW32__)
/* code specific to MinGW compilers; test this before __GNUC__,
   because MinGW's gcc defines __GNUC__ as well */
#elif defined(__GNUC__)
/* code for the GNU C compiler */
#elif defined(_MSC_VER)
/* code specific to the MSVC compiler; _MSC_VER holds the version number */
#elif defined(__BORLANDC__)
/* code specific to Borland compilers */
#endif
See the "Microsoft-Specific Predefined Macros" table of Visual C predefined macros
You could check for _MSC_VER.
Preferably, you should rely on portable symbols. I understand those symbols may not always be defined, so you can consult the Predef project for an extensive list of the non-portable preprocessor macros for standards, compilers, libraries, operating systems and architectures.
However, the function you specifically mention in this question has been included within the C11 standard as a part of Annex K.3, the bounds-checking interfaces (library).
K.3.1.1p2 states:
The functions, macros, and types declared or defined in K.3 and its subclauses are declared and defined by their respective headers if __STDC_WANT_LIB_EXT1__ is defined as a macro which expands to the integer constant 1 at the point in the source file where the appropriate header is first included
Thus, you should prefer checking __STDC_WANT_LIB_EXT1__, and use compiler-specific symbols only when that doesn't exist.
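A minimal sketch of that approach (note that many implementations, glibc included, do not provide Annex K):

#define __STDC_WANT_LIB_EXT1__ 1   /* request Annex K before the first include */
#include <stdio.h>

int main(void)
{
#if defined(__STDC_LIB_EXT1__)   /* defined by implementations that supply Annex K */
    char buf[32];
    sprintf_s(buf, sizeof buf, "%d", 42);   /* bounds-checked variant */
    puts(buf);
#else
    puts("Annex K not available; use snprintf instead");
#endif
    return 0;
}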

Can an ANSI C-compliant implementation include additional functions in its standard library?

Is an ANSI C-compliant implementation allowed to include additional types and functions in its standard library, beyond those enumerated by the standard? (An ideal answer would reference the relevant part of the ANSI standard.)
I ask particularly because Mac OS 10.7 declares the getline function in stdio.h, even when compiling with gcc or clang using the -ansi flag. This breaks several older programs that define their own getline function. Is this a fault of Mac OS 10.7? (The man page for getline on Mac OS 10.7 says that getline conforms to the POSIX.1 standard, which came in 2008.)
Edit: To clarify, I find it odd that including stdio.h in an ANSI C89 program on Mac OS 10.7 also pulls in the declaration for the getline function, since getline is not one of the functions enumerated in the K&R (and presumably ANSI) description of stdio.h. In particular, attempting to compile noweb:
gcc -ansi -pedantic -c -o notangle.o notangle.c
In file included from notangle.nw:28:
getline.h:4: error: conflicting types for ‘getline’
/usr/include/stdio.h:449: error: previous declaration of ‘getline’ was here
Is it a bug that Mac OS 10.7 includes the declaration of getline in stdio.h even when compiling for the ANSI C89 standard?
From section 7.1.3 paragraph 2 of n1570 (which is a draft of C1x):
No other identifiers are reserved.
This is the part that means getline shouldn't be declared by <stdio.h>, since it's not a reserved identifier according to the spec. So if your library declares getline in <stdio.h>, it's not technically compliant with the C standard...
However, you should be able to use the feature test macros to cause getline to be undefined in <stdio.h>.
#undef _POSIX_C_SOURCE
#define _POSIX_C_SOURCE 200112L
#include <stdio.h>
This will give you only the definitions from the older POSIX standards. This won't work on some GNU C++ implementations, which is extremely frustrating for some folks.
The relevant section of the manpage is (taken from a glibc manpage, sorry...)
Feature Test Macro Requirements for glibc (see feature_test_macros(7)):
getline(), getdelim():
Since glibc 2.10:
_POSIX_C_SOURCE >= 200809L || _XOPEN_SOURCE >= 700
Before glibc 2.10:
_GNU_SOURCE
This part of the manpage tells you which macros need to be defined to which values in order to get the definition. My bet is that _POSIX_C_SOURCE is already defined by your compiler to 200809L.
The idea of feature test macros is that if you define your macros, like _POSIX_C_SOURCE, _BSD_SOURCE, _XOPEN_SOURCE, etc. to the values you want, you won't need to worry about new library functions clashing with your existing functions. There is also _GNU_SOURCE, which turns everything on if you use glibc, but I suggest giving that macro a wide berth.
Yes, a compliant implementation is allowed to define additional identifiers, including functions, as long as the names it uses are among the identifiers reserved by the standard. For example:
All identifiers that begin with an underscore and either an uppercase letter or another
underscore are always reserved for any use;
All identifiers that begin with an underscore are always reserved for use as identifiers
with file scope in both the ordinary and tag name spaces;
All external names that begin with is, to, str, mem or wcs followed by a lowercase letter;
In addition there are names that are reserved only if you include certain headers; for example, if you include <errno.h> then it can define any macro starting with E followed by a digit or uppercase letter.
However, getline() is not such a reserved name, and a compliant implementation must leave it available for the programmer's own use.
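To illustrate, a strictly conforming program is free to define its own getline(), which is exactly what breaks when <stdio.h> wrongly exposes the POSIX one. A minimal sketch mirroring the noweb situation:

#include <stdio.h>

/* Our own getline: legal in strict C89/C90, where the name is not reserved.
   If <stdio.h> also declares POSIX's ssize_t getline(char **, size_t *, FILE *),
   the compiler reports conflicting types, as in the noweb build above. */
int getline(char *buf, int size)
{
    return fgets(buf, size, stdin) != NULL ? 0 : -1;
}

int main(void)
{
    char line[128];
    if (getline(line, (int)sizeof line) == 0)
        fputs(line, stdout);
    return 0;
}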
