gcc builtin function and custom function both invoked in same program - c

I'm trying to understand when gcc's builtin functions are used. In the following code, both gcc's sqrt() and my custom sqrt() are invoked when I compile without -fno-builtin. Can someone explain what is going on?
Also, I know the list of gcc's builtin functions is at https://gcc.gnu.org/onlinedocs/gcc/Other-Builtins.html and realize the recommended way around these types of problems is to just rename the conflicting function. Is there a gcc output option/warning that will show when a custom function is named the same as a builtin or when a builtin is used instead of the custom function?
#include <stdio.h>

double sqrt(double);

int main(void)
{
    double n;
    n = 2.0;
    printf("%f\n", sqrt(n));
    printf("%f\n", sqrt(2.0));
    return 0;
}

double sqrt(double x)
{
    printf("my_sqrt ");
    return x;
}
Running after compiling with gcc -o my_sqrt my_sqrt.c, the output is:
my_sqrt 2.000000
1.414214
Running after compiling with gcc -fno-builtin -o my_sqrt my_sqrt.c, the output is:
my_sqrt 2.000000
my_sqrt 2.000000

It's not the case that two different sqrt functions are called at runtime. The call to sqrt(2.0) happens at compile time, which is legal because 2.0 is a constant and sqrt is a standard library function, so the compiler knows its semantics. And the compiler is allowed to assume that you are not breaking the rules. We'll get around to what that means in a minute.
At runtime, there is no guarantee that your sqrt function will be called for sqrt(n), but it might be. GCC uses your sqrt function unless you declare n to be const double; Clang goes ahead and does the computation at compile time, because it can figure out what n contains at that point. Both of them will use the built-in sqrt function (unless you specify -fno-builtin) for an expression whose value cannot be known at compile time. But that doesn't necessarily mean that they will issue code to call a function; if the machine has a reliable SQRT opcode, the compiler could choose to just emit it rather than emitting a function call.
The C standard gives compilers a lot of latitude here, because it only requires the observable behaviour of a program to be consistent with the results specified by the semantics in the standard, and furthermore it only requires that to be the case if the program does not exhibit undefined behaviour. So the compiler is basically free to do any computation it wants at compile-time, provided that a program without undefined behaviour would produce the same result. [Note 1].
Moreover, the definition of "the same result" is also a bit loose for floating point computations, because the standard semantics do not prevent computations from being done with more precision than the data types can theoretically represent. That may seem innocuous, but in some cases a computation with extra precision can produce a different result after rounding. So if during compilation a compiler can compute a more accurate intermediate result than would result from the code it would have generated for run-time computation of the same expression, that's fine as far as the standard is concerned. (And most of the time, it will be fine for you, too. But there are exceptions.)
To return to the main point, it still seems surprising that the compiler, which knows that you have redefined the sqrt function, can still use the built-in sqrt function in its compile-time computation. The reason is simple (although often ignored): your program is not valid. It exhibits undefined behaviour, and when your program has undefined behaviour, all bets are off.
The undefined behaviour is specified in §7.1.3 of the standard, which concerns Reserved Identifiers. It supplies a list of reserved identifiers, which really are reserved, whether the compiler you happen to be using warns you about that or not. The list includes the following, which I'll quote in full:
All identifiers with external linkage in any of the following subclauses (including the future library directions) and errno are always reserved for use as identifiers with external linkage.
The "following subclauses" at point contain the list of standard library functions, all of which have external linkage. Just to nail the point home, the standard continues with:
If the program declares or defines an identifier in a context in which it is reserved (other than as allowed by 7.1.4), the behavior is undefined. [Note 2]
You have declared sqrt as an externally-visible function, and that's not permitted whether or not you include math.h. So you're in undefined behaviour territory, and the compiler is perfectly entitled to not worry about your definition of the sqrt function when it is doing compile-time computation. [Note 3]
(You could try to declare your sqrt implementation as static in order to avoid the restriction on externally-visible names. That will work with recent versions of GCC; it allows the static declaration to override the standard library definition. Clang, which is more aggressive about compile-time computations, still uses the standard definition of sqrt. And a quick test with MSVC (on godbolt.org) seems to indicate that it just outright bans redefinition of the standard library function.)
So what if you really really want to write sqrt(x) for your own definition of sqrt? The standard does give you an out: since sqrt is not reserved for macro names, you can define it as a macro which is substituted by the name of your implementation [Note 4], at least if you don't #include <math.h>. If you do include the header, then this is probably not conformant, because in that case the identifiers are reserved as well for macro names [Note 5].
Notes
That liberty is not extended to integer constant expressions, with the result that a compiler cannot turn strlen("Hello") into the constant value 5 in a context where an integer constant expression is required. So this is not legal:
switch (i) {
case strlen("Hello"):
    puts("world");
    break;
default:
    break;
}
But this will probably not call strlen six times (although you shouldn't count on that optimisation, either):
/* Please don't do this. Calling strlen on every loop iteration
 * blows up linear-time loops into quadratic-time monsters, which is
 * an open invitation for someone to do a denial-of-service attack
 * against you by supplying a very long string.
 */
for (int i = 0; i < strlen("Hello"); ++i) {
    putchar("world"[i]);
}
Up to the current C standard, this statement was paragraph 2 of §7.1.3. In the C23 draft, though, it has been moved to paragraph 8 of §6.4.2.1 (the lexical rules for identifiers). There are some other changes to the restrictions on reserved identifiers (and a large number of new reserved identifiers), but that doesn't make any difference in this particular case.
In many instances of undefined behaviour, the intent is simply to let the compiler avoid doing extra sanity checks. Instead, it can just assume that you didn't break the rules, and do whatever it would otherwise do.
Please don't use the name _sqrt, even though it will probably work. Names starting with underscores are all reserved, by the same §7.1.3. If the name starts with two underscores or an underscore followed by a capital letter, it is reserved for all uses. Other identifiers starting with an underscore are reserved for use at file scope (both as a function name and as a struct tag). So don't do that. If you want to use underscores to indicate that the name is somehow internal to your code, put it at the end of the identifier rather than at the beginning.
Standard headers may also define the names of standard library functions as function-like macros, possibly in order to substitute a different reserved name, known to the compiler, which causes the generation of inline code, perhaps using special-purpose machine opcodes. Regardless, the standard requires that the functions exist, and it allows you to #undef the macros in order to guarantee that the actual function will be used. But it doesn't explicitly allow the names to be redefined.

Related

What happens when I leave out an argument to a function in C?

First of all, I know this way of programming is not good practice. For an explanation of why I'm doing this, read on after the actual question.
When declaring a function in C like this:
int f(n, r) {…}
The types of r and n will default to int. The compiler will likely generate a warning about it, but let's choose to ignore that.
Now suppose we call f but, accidentally or otherwise, leave out an argument:
f(25);
This will still compile just fine (tested with both gcc and clang). However there is no warning from gcc about the missing argument.
So my question is:
Why does this not produce a warning (in gcc) or error?
What exactly happens when it is executed? I assume I'm invoking undefined behaviour but I'd still appreciate an explanation.
Note that it does not work the same way when I declare int f(int n, int r) {…}; neither gcc nor clang will compile this.
Now if you're wondering why I would do such a thing, I was playing Code Golf and tried to shorten my code which used a recursive function f(n, r). I needed a way to call f(n, 0) implicitly, so I defined F(n) { return f(n, 0) } which was a little too many bytes for my taste. So I wondered whether I could just omit this parameter. I can't, it still compiles but no longer works.
While optimizing this code, it was pointed out to me that I could just leave out a return at the end of my function – no warning from gcc about this either. Is gcc just too tolerant?
You don't get any diagnostics from the compiler because you are not using modern "prototyped" function declarations. If you had written
int f(int n, int r) {…}
then a subsequent f(25) would have triggered a diagnostic. With the compiler on the computer I'm typing this on, it's actually a hard error.
"Old-style" function declarations and definitions intentionally cause the compiler to relax many of its rules, because the old-style code that they exist for backward compatibility with would do things like this all the dang time. Not the thing you were trying to do, hoping that f(25) would somehow be interpreted as f(25, 0), but, for instance, f(25) where the body of f never looks at the r argument when its n argument is 25.
The pedants commenting on your question are pedantically correct when they say that literally anything could happen (within the physical capabilities of the computer, anyway; "demons will fly out of your nose" is the canonical joke, but it is, in fact, a joke). However, it is possible to describe two general classes of things that are what usually happens.
With older compilers, what usually happens is, code is generated for f(25) just as it would have been if f only took one argument. That means the memory or register location where f will look for its second argument is uninitialized, and contains some garbage value.
With newer compilers, on the other hand, the compiler is liable to observe that any control-flow path passing through f(25) has undefined behavior, and based on that observation, assume that all such control-flow paths are never taken, and delete them. Yes, even if it's the only control-flow path in the program. I have actually witnessed Clang spit out main: ret for a program all of whose control-flow paths had undefined behavior!
GCC not complaining about f(n, r) { /* no return statement */ } is another case of the same thing: the old-style function definition relaxes a rule. void was invented in the 1989 C standard; prior to that, there was no way to say explicitly that a function does not return a value. So you don't get a diagnostic because the compiler has no way of knowing that you didn't mean to do that.
Independently of that, yes, GCC's default behavior is awfully permissive by modern standards. That's because GCC itself is older than the 1989 C standard and nobody has reexamined its default behavior in a long time. For new programs, you should always use -Wall, and I recommend also at least trying -Wextra, -Wpedantic, -Wstrict-prototypes, and -Wwrite-strings. In fact, I recommend going through the "Warning Options" section of the manual and experimenting with all of the additional warning options. (Note however that you should not use -std=c11, because that has a nasty tendency to break the system headers. Use -std=gnu11 instead.)
First off, the C standard doesn't distinguish between warnings and errors. It only talks about "diagnostics". In particular, a compiler can always produce an executable (even if the source code is completely broken) without violating the standard.1
The types of r and n will default to int.
Not anymore. Implicit int has been gone from C since 1999. (And your test code requires C99 because for (int i = 0; ... isn't valid in C90).
In your test code gcc does issue a diagnostic for this:
.code.tio.c: In function ‘f’:
.code.tio.c:2:5: warning: type of ‘n’ defaults to ‘int’ [-Wimplicit-int]
It's not valid code, but gcc still produces an executable (unless you enable -Werror).
If you add the required types (int f(int n, int r)), it uncovers the next issue:
.code.tio.c: In function ‘main’:
.code.tio.c:5:3: error: too few arguments to function ‘f’
Here gcc somewhat arbitrarily decided not to produce an executable.
Relevant quotes from C99 (and probably C11 too; this text hasn't changed in the n1570 draft):
6.9.1 Function definitions
Constraints
[...]
If the declarator includes an identifier list, each declaration in the declaration list shall
have at least one declarator, those declarators shall declare only identifiers from the
identifier list, and every identifier in the identifier list shall be declared.
Your code violates a constraint (your function declarator includes an identifier list, but there is no declaration list), which requires a diagnostic (such as the warning from gcc).
Semantics
[...] If the
declarator includes an identifier list, the types of the parameters shall be declared in a
following declaration list.
Your code violates this shall rule, so it has undefined behavior. This applies even if the function is never called!
6.5.2.2 Function calls
Constraints
[...]
If the expression that denotes the called function has a type that includes a prototype, the
number of arguments shall agree with the number of parameters. [...]
Semantics
[...]
[...] If the number of arguments does not equal the number of parameters, the
behavior is undefined. [...]
The actual call also has undefined behavior if the number of arguments passed doesn't match the number of parameters the function has.
As for omitting return: This is actually valid as long as the caller doesn't look at the returned value.
Reference (6.9.1 Function definitions, Semantics):
If the } that terminates a function is reached, and the value of the function call is used by
the caller, the behavior is undefined.
1 The sole exception seems to be the #error directive, about which the standard says:
The implementation shall not successfully translate a preprocessing translation unit
containing a #error preprocessing directive unless it is part of a group skipped by
conditional inclusion.

Why does the following code give different results when compiling with gcc and g++?

#include <stdio.h>

int main()
{
    const int a = 1;
    int *p = (int *)&a;
    (*p)++;
    printf("%d %d\n", *p, a);
    if (a == 1)
        printf("No\n");  /* "No" in g++. */
    else
        printf("Yes\n"); /* "Yes" in gcc. */
    return 0;
}
The above code gives No as output in g++ compilation and Yes in gcc compilation. Can anybody please explain the reason behind this?
Your code triggers undefined behaviour because you are modifying a const object (a). It doesn't have to produce any particular result, not even on the same platform, with the same compiler.
Although the exact mechanism for this behaviour isn't specified, you may be able to figure out what is happening in your particular case by examining the assembly produced by the code (you can see that by using the -S flag). Note that compilers are allowed to make aggressive optimizations by assuming the code has well-defined behaviour. For instance, a could simply be replaced by 1 wherever it is used.
From the C++ Standard (1.9 Program execution)
4 Certain other operations are described in this International
Standard as undefined (for example, the effect of attempting to
modify a const object). [ Note: This International Standard imposes
no requirements on the behavior of programs that contain undefined
behavior. —end note ]
Thus your program has undefined behaviour.
In your code, notice following two lines
const int a=1; // a is of type constant int
int *p=(int *)&a; // p is of type int *
you are putting the address of a const int variable into an int * and then trying to modify the value, which should have been treated as const. This is not allowed and invokes undefined behaviour.
For your reference, as mentioned in chapter 6.7.3, C11 standard, paragraph 6
If an attempt is made to modify an object defined with a const-qualified type through use
of an lvalue with non-const-qualified type, the behavior is undefined. If an attempt is
made to refer to an object defined with a volatile-qualified type through use of an lvalue
with non-volatile-qualified type, the behavior is undefined
So, to cut a long story short, you cannot rely on the outputs for comparison. They are the result of undefined behaviour.
Okay, we have here 'identical' code passed to "the same" compiler, but once with a C flag and the other time with a C++ flag. As far as any reasonable user is concerned nothing has changed, so the code should be interpreted identically by the compiler.
Actually, that's not true. While I would be hard pressed to point to it in a standard, the precise interpretation of 'const' differs slightly between C and C++. In C it's very much an add-on: the 'const' flag says that this normal variable 'a' should not be written to by the code round here, but there is a possibility that it will be written to elsewhere. In C++ the emphasis is much more on the immutable-constant concept, and the compiler knows that this constant is more akin to an 'enum' than to a normal variable.
So I expect this slight difference means that slightly different parse trees are generated, which eventually leads to different assembler. This sort of thing is actually fairly common: code that's in the C/C++ subset does not always compile to exactly the same assembler, even with 'the same' compiler. It tends to be caused by other language features meaning that there are some things you can't prove about the code right now in one of the languages but can in the other.
Usually C is the performance winner (as was re-discovered by the Linux kernel devs) because it's a simpler language, but in this example C++ would probably turn out faster (unless the C dev switches to a macro or enum and catches the unreasonable act of taking the address of an immutable constant).

Do I really need to include string.h? [duplicate]

What will happen if I don't include the header files when running a c program? I know that I get warnings, but the programs runs perfectly.
I know that the header files contain function declarations. Therefore when I don't include them, how does the compiler figure it out? Does it check all the header files?
I know that I get warnings, but the programs runs perfectly.
That is an unfortunate legacy of pre-ANSI C: the language did not require function prototypes, so standard C allows it to this day (usually, a warning can be produced to find functions called without a prototype).
When you call a function with no prototype, the C compiler makes assumptions about the function being called:
Function's return type is assumed to be int
All parameters are assumed to be declared (i.e. no ... vararg stuff)
All parameters are assumed to be whatever you pass after default promotions, and so on.
If the function being called with no prototype fits these assumptions, your program will run correctly; otherwise, it's undefined behavior.
Before the 1989 ANSI C standard, there was no way to declare a function and indicate the types of its parameters. You just had to be very careful to make each call consistent with the called function, with no warning from the compiler if you got it wrong (like passing an int to sqrt()). In the absence of a visible declaration, any function you call was assumed to return int; this was the "implicit int" rule. A lot of standard functions do return int, so you could often get away with omitting a #include.
The 1989 ANSI C standard (which is also, essentially, the 1990 ISO C standard) introduced prototypes, but didn't make them mandatory (and they still aren't). So if you call
int c = getchar();
it would actually work, because getchar() returns an int.
The 1999 ISO C standard dropped the implicit int rule, and made it illegal (actually a constraint violation) to call a function with no visible declaration. So if you call a standard function without the required #include, a C99-conforming compiler must issue a diagnostic (which can be just a warning). Non-prototype function declarations (ones that don't specify the types of the arguments) are still legal, but they're considered obsolescent.
(The 2011 ISO C standard didn't change much in this particular area.)
But there's still plenty of code out there that was written for C90 compilers, and most modern compilers still support the older standard.
So if you call a standard function without the required #include, what will probably happen is that (a) the compiler will warn you about the missing declaration, and (b) it will assume that the function returns int and takes whatever number and type(s) of arguments you actually passed it (also accounting for type promotion, such as short to int and float to double). If the call is correct, and if your compiler is lenient, then your code will probably work -- but you'll have one more thing to worry about if it fails, perhaps for some unrelated reason.
Variadic functions like printf are another matter. Even in C89/C90, calling printf with no visible prototype had undefined behavior. A compiler can use an entirely different calling convention for variadic functions, so printf("hello") and puts("hello") might generate quite different code. But again, for compatibility with old code, most compilers use a compatible calling convention, so for example the first "hello world" program in K&R1 will probably still compile and run.
You can also write your own declarations for standard functions; the compiler doesn't care whether it sees a declaration in a standard header or in your own source file. But there's no point in doing so. Declarations have changed subtly from one version of the standard to the next, and the headers that came with your implementation should be the correct ones.
So what will actually happen if you call a standard function without the corresponding #include?
In a typical working environment, it doesn't matter, because with any luck your program won't survive code review.
In principle, any compiler that conforms to C99 or later may reject your program with a fatal error message. (gcc will behave this way with -std=c99 -pedantic-errors) In practice, most compilers will merely print a warning. The call will probably work if the function returns int (or if you ignore the result) and if you get all the argument types correct. If the call is incorrect, the compiler may not be able to print good diagnostics. If the function doesn't return int, the compiler will probably assume that it does, and you'll get garbage results, or even crash your program.
So you can study this answer of mine, follow up by reading the various versions of the C standard, find out exactly which edition of the standard your compiler conforms to, and determine the circumstances in which you can safely omit a #include header -- with the risk that you'll mess something up next time you modify your program.
Or you can pay attention to your compiler's warnings (which you've enabled with whatever command-line options are available), read the documentation for each function you call, add the required #includes at the top of each source file, and not have to worry about any of this stuff.
First of all: just include them.
If you don't, the compiler will use an implicit declaration for any undeclared function, roughly:
int functionName();
that is, a function returning int whose arguments are not checked.
So it will compile, and link if the functions are available. But you will have problems at runtime.
There are a lot of things you can't do if you leave out headers:
(I'm hoping to get some more from the comments since my memory is failing on this ...)
You can't use any of the macros defined in the headers. This can be significant.
The compiler can't check that you are calling functions properly since the headers define their parameters for it.
For compatibility with old programs, C compilers can compile code that calls functions which have not been declared, assuming the parameters and return value are of type int. What can happen? See for example this question: Troubling converting string to long long in C. I think it's a great illustration of the problems you can run into if you don't include the necessary headers and so don't declare the functions you use. What happened was that the asker tried to use atoll without including stdlib.h, where atoll is declared:
char s[30] = { "115" };
long long t = atoll(s);
printf("Value is: %lld\n", t);
Surprisingly, this printed 0, not 115, as expected! Why? Because the compiler didn't see the declaration of atoll and assumed its return value was an int, and so picked up only part of the value produced by the function; in other words, the return value got truncated.
That's one of the reasons it is recommended to compile your code with -Wall (all warnings on).

What will happen if I don't include header files

What will happen if I don't include the header files when running a c program? I know that I get warnings, but the programs runs perfectly.
I know that the header files contain function declarations. Therefore when I don't include them, how does the compiler figure it out? Does it check all the header files?
I know that I get warnings, but the programs runs perfectly.
That is an unfortunate legacy of pre-ANSI C: the language did not require function prototypes, so the standard C allows it to this day (usually, a warning can be produced to find functions called without a prototype).
When you call a function with no prototype, C compiler makes assumptions about the function being called:
Function's return type is assumed to be int
All parameters are assumed to be declared (i.e. no ... vararg stuff)
All parameters are assumed to be whatever you pass after default promotions, and so on.
If the function being called with no prototype fits these assumptions, your program will run correctly; otherwise, it's undefined behavior.
Before the 1989 ANSI C standard, there was no way to declare a function and indicate the types of its parameters. You just had to be very careful to make each call consistent with the called function, with no warning from the compiler if you got it wrong (like passing an int to sqrt()). In the absence of a visible declaration, any function you call was assumed to return int; this was the "implicit int" rule. A lot of standard functions do return int, so you could often get away with omitting a #include.
The 1989 ANSI C standard (which is also, essentially, the 1990 ISO C standard) introduced prototypes, but didn't make them mandatory (and they still aren't). So if you call
int c = getchar();
it would actually work, because getchar() returns an int.
The 1999 ISO C standard dropped the implicit int rule, and made it illegal (actually a constraint violation) to call a function with no visible declaration. So if you call a standard function without the required #include, a C99-conforming compiler must issue a diagnostic (which can be just a warning). Non-prototype function declarations (ones that don't specify the types of the arguments) are still legal, but they're considered obsolescent.
(The 2011 ISO C standard didn't change much in this particular area.)
But there's still plenty of code out there that was written for C90 compilers, and most modern compilers still support the older standard.
So if you call a standard function without the required #include, what will probably happen is that (a) the compiler will warn you about the missing declaration, and (b) it will assume that the function returns int and takes whatever number and type(s) of arguments you actually passed it (also accounting for type promotion, such as short to int and float to double). If the call is correct, and if you compiler is lenient, then your code will probably work -- but you'll have one more thing to worry about if it fails, perhaps for some unrelated reason.
Variadic functions like printf are another matter. Even in C89/C90, calling printf with no visible prototype had undefined behavior. A compiler can use an entirely different calling convention for variadic functions, so printf("hello") and puts("hello") might generate quite different code. But again, for compatibility with old code, most compilers use a compatible calling convention, so for example the first "hello world" program in K&R1 will probably still compile and run.
You can also write your own declarations for standard functions; the compiler doesn't care whether it sees a declaration in a standard header or in your own source file. But there's no point in doing so. Declarations have changed subtly from one version of the standard to the next, and the headers that came with your implementation should be the correct ones.
So what will actually happen if you call a standard function without the corresponding #include?
In a typical working environment, it doesn't matter, because with any luck your program won't survive code review.
In principle, any compiler that conforms to C99 or later may reject your program with a fatal error message. (gcc will behave this way with -std=c99 -pedantic-errors) In practice, most compilers will merely print a warning. The call will probably work if the function returns int (or if you ignore the result) and if you get all the argument types correct. If the call is incorrect, the compiler may not be able to print good diagnostics. If the function doesn't return int, the compiler will probably assume that it does, and you'll get garbage results, or even crash your program.
So you can study this answer of mine, follow up by reading the various versions of the C standard, find out exactly which edition of the standard your compiler conforms to, and determine the circumstances in which you can safely omit a #include header -- with the risk that you'll mess something up next time you modify your program.
Or you can pay attention to your compiler's warnings (Which you've enabled with whatever command-line options are available), read the documentation for each function you call, add the required #includes at the top of each source file, and not have to worry about any of this stuff.
First of all: just include them.
If you don't the compiler will use the default prototype for undeclared functions, which is:
int functionName(int argument);
So it will compile, and link if the functions are available. But you will have problems at runtime.
There are a lot of things you can't do if you leave out headers:
(I'm hoping to get some more from the comments since my memory is failing on this ...)
You can't use any of the macros defined in the headers. This can be significant.
The compiler can't check that you are calling functions properly since the headers define their parameters for it.
For compatibility with old program C compilers can compile code calling functions which have not been declared, assuming the parameters and return value is of type int. What can happen? See for example this question: Troubling converting string to long long in C I think it's a great illustration of the problems you can run into if you don't include necessary headers and so don't declare functions you use. What happened to the guy was he tried to use atoll without including stdlib.h where atoll is declared:
char s[30] = { "115" };
long long t = atoll(s);
printf("Value is: %lld\n", t);
Surprisingly, this printed 0, not 115, as expected! Why? Because the compiler didn't see the declaration of atoll and assumed its return value was an int, so it picked up only part of the value the function actually returned; in other words, the return value got truncated.
That's one of the reasons it is recommended to compile your code with -Wall (all warnings on).

C program is running without including Header files? [duplicate]

Possible Duplicate:
C program without header
I have been studying C for a long time, but one thing has been bothering me: today I made a C program and forgot to include the stdio.h and conio.h header files. I saved the file as kc.c. When I compiled and ran this .c file, the output was as I expected.
But how can a C program run without the standard header files?
Or is there some concept here that I am missing?
EDIT: the program
int main()
{
    int i = 12, j = 34;
    int *pa = &i, *pb = &j;
    printf("the value of i and j is %d %d respectively ", *pa, *pb);
    getch();
    return 0;
}
I have used the printf() function from the stdio.h header here, but without including it, how did the program compile and run successfully?
The compiler is allowed to make things work, but is under no obligation to do so.
You are supposed to declare all variable argument list functions before using them; not declaring printf() properly leads to undefined behaviour (and one valid undefined behaviour is to work as expected).
You should be getting warnings about the undeclared functions if you compile in C99 mode (but Turbo C probably doesn't have a C99 mode).
Nit-picking:
[H]ow can a C program could run without using the standard header file?
All programs run without using any headers whatsoever. However, most programs are compiled using the standard headers to declare the standard functions, and the better programs ensure that all functions are declared before they are used (or are defined as static functions before they are used).
C99 requires this, though many compilers will allow errant programs to compile. However, the compilation should produce a diagnostic; the diagnostic may or may not lead to the compilation failing. In practice, it usually doesn't cause the compilation to fail, but it could and with some compilers (GCC, for example) you can force the compiler's hand (e.g. with GCC's -Werror=missing-prototypes -Werror=old-style-definition options).
When the language standard being applied pre-dates ISO C99, C does not require a function to be declared or defined before it is referenced.
However, when the compiler encounters such a function call, it simply assumes that the function returns an int and takes an indeterminate number and type of parameters. This is called an implicit declaration. If you later declare the function, or call it with a different number of parameters or with incompatible parameters, some compilers will warn that the call does not match the declaration implied by the first use; the ISO C89 standard effectively treats the function as variadic [like printf()], where any number and type of parameters are allowed.
Moreover, if the actual return value is not an int, whatever value is accepted and processed may not make much sense one way or another.
It is bad form to rely on an implicit declaration, and most compilers will issue a warning. If your compiler did not, you need to increase the warning level; those diagnostics are there to help improve your code quality, and you shouldn't simply ignore them (or any warning, for that matter).
In C++ the rules are tighter and failure to declare or define a function before referencing it is an error, since it is necessary to allow function overloading.
A header file is nothing more than a listing of constants, preprocessor macros and function prototypes. The function prototypes tell C what arguments each function takes.
If the compiler sees a function being used without a corresponding prototype or function definition, it generates an implicit declaration of the form int func(). Since C functions are linked solely by name and not by function signature (as is the case with C++), the linker later locates the function definitions in the standard library.