I need to pass a double to my program, but the following does not work:
int main(int argc, char *argv[]) {
    double ddd;
    ddd = atof(argv[1]);
    printf("%f\n", ddd);
    return 0;
}
If I run my program as ./myprog 3.14 it still prints 0.000000.
Can somebody please help?
My guess is that you forgot to #include <stdlib.h>. If that's the case, gcc will issue the following warning:
warning: implicit declaration of function 'atof' [-Wimplicit-function-declaration]
And produce exactly the output you reported: 0.000000.
As remyabel indicated, you probably neglected to #include <stdlib.h>. The reason this matters is that, without a declaration of atof() in scope, the compiler falls back on the old implicit-declaration rule and assumes the return value is int. It isn't int, which means the behavior you observe (getting a return value of 0) is actually undefined.
To be clear, without the double-returning declaration, the line ddd=atof(argv[1]) is treated as a call to an int-returning function, whose result is then converted to a double. The calling convention on your particular system most likely returns ints and doubles in different registers, so the 0 is probably just whatever happened to be in the integer return register, while the real double return value languishes, unobserved, in a floating-point register.
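To make that concrete, the only thing the missing header contributes here is a declaration along these lines (this is the prototype the C standard specifies for atof()):
/* With this declaration in scope, the compiler knows the call returns
   a double and reads the floating-point return register accordingly. */
double atof(const char *nptr);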
In C (unlike C++) you are not required to declare a function before you use it, and when that happens (no prototype), the compiler makes some assumptions about that function. One of those assumptions is that it returns int. There's no error; atof() runs, but its result is read incorrectly: you typically get whatever value happens to be in the register where an int would be returned (0 in your case, but it could be something else).
P.S. atof() and atoi() hide input errors, so most people prefer to use strtol() and strtod() instead. (And always add -Wall to your gcc call, gcc -Wall test.c, so you at least see the implicit-declaration warning.)
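For example, here is a minimal sketch of the strtod() approach, which reports input that atof() would silently turn into 0 (the usage check is just illustrative):
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[]) {
    if (argc < 2)
        return 1;
    char *end;
    double ddd = strtod(argv[1], &end);   /* end points just past the parsed text */
    if (end == argv[1] || *end != '\0') {
        fprintf(stderr, "'%s' is not a valid number\n", argv[1]);
        return 1;
    }
    printf("%f\n", ddd);
    return 0;
}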
I am programming a Teensy micro-controller as a part of a C course and am trying to work out the value of one of my integer variables. I have an integer variable called Contrast, which is initialised to the value of a constant defined as a hexadecimal number at the beginning of the .c file:
#define LCD_DEFAULT_CONTRAST 0x3F
int Contrast = LCD_DEFAULT_CONTRAST;
I am trying to investigate how this Contrast value is stored and displayed, whether it shows up as 63 or 0x3F, and whether the two are interchangeable. I tried to use:
printf("%d", Contrast);
to print out the Contrast value to the terminal and I got the error implicit declaration of function 'printf'. I thought printf() was part of the built-in C library, so I am confused why this is not working.
Can anyone please tell me how I print the value of this variable to the screen?
The implicit declaration error just means your compiler proper doesn't have a declaration for printf. Unless you're also getting a linker error, the linker (linking usually follows compilation, unless you pass -c to disable it) is probably pulling in the standard library anyway, in which case you can fix the warning simply by including stdio.h, or less preferably by declaring int printf(char const *, ...); yourself.
If you truly don't have the standard library, you'll need to convert the integer to a string manually with something like:
int n = 42;
char buf[20];
char *end = buf + (sizeof(buf) - 1), *p = end;
*p-- = 0;                     /* NUL-terminate; the string is built backwards */
if (n == 0) *p = '0';
else {
    while (n) {
        *p-- = n % 10 + '0';  /* peel off the lowest decimal digit */
        n /= 10;
    }
    p++;                      /* p now points at the first digit */
}
and then pass it to your system's raw I/O routine, for which you'll need to have set up the system-call assembly yourself.
If you don't have a system, it'd be even more technical, and you probably wouldn't be asking this question.
printf() is declared in the standard library header <stdio.h>.
You have to #include <stdio.h> to use printf(). It is a library call, much like any other library call in C.
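Once the header is included, the 63-versus-0x3F part of your question is just formatting: the variable holds a single integer value, and the base only matters when you print it. A quick sketch reusing the names from your question:
#include <stdio.h>

#define LCD_DEFAULT_CONTRAST 0x3F

int Contrast = LCD_DEFAULT_CONTRAST;

int main(void) {
    printf("%d\n", Contrast);              /* prints 63 (decimal formatting) */
    printf("%#x\n", (unsigned)Contrast);   /* prints 0x3f (hexadecimal formatting) */
    return 0;
}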
With <stdlib.h> included, the following code gives the output 123.34.
#include <stdlib.h>

int main()
{
    char *str = "123.34";
    float f = strtof(str, NULL);
    printf("%f", f);
}
But without <stdlib.h> it produces the output of 33.000000.
What is the role of <stdlib.h> here, and why did the value 33.000000 occur when it appears nowhere in the code?
You must take a look at the warning generated by the compiler.
warning: implicit declaration of function 'strtof' [-Wimplicit-function-declaration]
This still yields a result, but it is not deterministic in any way, because the expected return type is float, whereas without the header inclusion the return type is assumed by default to be int.
If you look into the stdlib header file, there is a declaration,
float strtof(const char *restrict, char **restrict);
With #include <stdlib.h>, we provide this declaration. When it is missing, the compiler assumes the function returns int, and hence the result is not deterministic.
On my system it produced 0.000000 as the output, whereas with the necessary inclusions I got 123.339996.
As a precaution, make a habit of always compiling your code with the -Wall option (assuming you are using gcc), or better yet, adding -Werror so that warnings become errors.
The <stdlib.h> header tells the compiler that strtof() returns a float; in its absence, the compiler is forced to assume it returns an int. Modern C compilers (GCC 5 and above) complain about the absence of a declaration for strtof() and/or a conflict with their built-in knowledge of strtof().
If you omit <stdlib.h>, your code is unacceptable in C99 and C11 because you didn't declare strtof() before using it. And since you also omit <stdio.h>, the code is invalid even in C90, let alone C99 or C11: you must declare variadic functions such as printf() before using them.
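For reference, a corrected sketch of the program from the question, with both headers included:
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const char *str = "123.34";
    float f = strtof(str, NULL);   /* declared in <stdlib.h> as returning float */
    printf("%f\n", f);             /* prints 123.339996, the nearest float to 123.34 */
    return 0;
}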
I was testing some code and I don't understand why it prints "The higher value is 0".
This is the main function:
int main() {
    double a, b, c;
    a = 576;
    b = 955;
    c = higher(a, b);
    printf("The higher value is %g\n", c);
    return 0;
}
and in another .c I have this function:
double higher(double a, double b) {
    if (a > b)
        return a;
    return b;
}
NOTE: if I put higher() in main.c it works correctly, but with the function in the other .c it tells me the higher value is 0. It also works if I cast the return value in higher() like this:
return (int)b;
Is there a difference if a function that returns a double is in the same .c as main() or in a different one?
Compile with a C99 or C11 compiler and read the warnings. You are using the function without a prototype.
Without a prototype, pre-C99 C assumes a function returns int by default.
C99 and later require the function to be declared.
Even without additional warnings enabled:
$ cat test.c
int main()
{
    int i = f();
    return 0;
}

int f(void)
{
    return 1;
}
$ gcc -std=c11 test.c
test.c: In function ‘main’:
test.c:13:2: warning: implicit declaration of function ‘f’ [-Wimplicit-function-declaration]
int i = f();
Note that gcc will not warn when compiling with -std=c90, but it will if you enable warnings with -Wall.
So, since higher() is assumed to return an int, that value is converted to double by the assignment (the type of c is not changed).
And now for the funny part: undefined behaviour (UB, memorize this phrase!) due to the mismatch between the signature used at the call site and the function's actual definition.
What actually happens depends on the procedure call standard (PCS) and the application binary interface (ABI) - check Wikipedia. Briefly: higher itself returns a double, which is likely handed back to the caller in a floating-point CPU register. The caller, OTOH, expects the return value (due to the missing prototype) in an integer CPU register (which happens to hold 0 by chance).
So, since caller and callee miscommunicate, you get the wrong result. Note that this is a bit speculative and depends on the PCS/ABI. All you need to remember is that this is UB, so anything can happen, even demons flying out of your nose.
Why use prototypes:
Well, as you already noticed, the compiler has no idea whether you call a function correctly. Even worse, it does not know which argument types are expected or which result type is returned. This is particularly a problem because C automatically converts some types (which is exactly what you encountered here).
Classical K&R (pre-standard) C did not have prototypes, so scalar arguments to unknown functions were assumed to be int or double at the call site, and the result defaulted to int. (This was a long time ago and I might be missing some parts; I started some coding with K&R, messed up types (exactly your problem here, but without a clean solution), threw it in a corner, and happily programmed in Modula-2 until some years later I tried ANSI C.)
If you compile code now, you should at least conform to (and compile for) C99, or better yet use the current standard (C11).
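The usual fix is to give main() a prototype for higher(), normally by putting it in a small header included by both translation units. A sketch (higher.h is a hypothetical name):
/* higher.h - hypothetical shared header */
#ifndef HIGHER_H
#define HIGHER_H
double higher(double a, double b);
#endif

/* main.c */
#include <stdio.h>
#include "higher.h"

int main(void) {
    double a = 576, b = 955;
    double c = higher(a, b);   /* the compiler now knows higher() returns double */
    printf("The higher value is %g\n", c);
    return 0;
}
Including the same header in the file that defines higher() also lets the compiler check that the declaration and the definition agree.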
Hopefully this is a very simple question. Following is the C program (test.c) I have.
#include <stdio.h>
//#include <stdlib.h>
int main(int argc, char *argv[]) {
    int intValue = atoi("1");
    double doubleValue = atof("2");
    fprintf(stdout, "The intValue is %d and the doubleValue is %g\n", intValue, doubleValue);
    return 0;
}
Note that I am using atoi() and atof() from stdlib.h, but I do not include that header file. I compile the program (gcc test.c) and get no compiler error!
I run the program (./a.out) and here is the output, which is wrong.
The intValue is 1 and the doubleValue is 0
Now I include stdlib.h (by removing the comments before the #include) and recompile it and run it again. This time I get the right output:
The intValue is 1 and the doubleValue is 2
How come the compiler did not complain about stdlib.h not being included, and still let me use the atoi() and atof() functions?
My gcc info:
$ gcc --version
gcc (GCC) 4.1.2 20070925 (Red Hat 4.1.2-27)
Any thoughts appreciated!
For historical reasons -- specifically, compatibility with very old C programs (pre-C89) -- using a function without having declared it first only provokes a warning from GCC, not an error. But the return type of such a function is assumed to be int, not double, which is why the program executes incorrectly.
If you use -Wall on the command line, you get a diagnostic:
$ gcc -Wall test.c
test.c: In function ‘main’:
test.c:5: warning: implicit declaration of function ‘atoi’
test.c:6: warning: implicit declaration of function ‘atof’
You should use -Wall basically always. Other very useful warning options for new code are -Wextra, -Wstrict-prototypes, -Wmissing-prototypes, -pedantic, and -Wwrite-strings, but compared to -Wall they have much higher false positive rates.
Tangentially: never use atoi nor atof, they hide input errors. Use strtol and strtod instead.
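For comparison, a sketch of the strtol style of error checking (strtod works the same way for floating point), reporting problems that atoi simply cannot:
#include <errno.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    const char *input = "123junk";
    char *end;

    errno = 0;
    long value = strtol(input, &end, 10);   /* base 10 */
    if (end == input)
        fprintf(stderr, "no digits in \"%s\"\n", input);
    else if (errno == ERANGE)
        fprintf(stderr, "value out of range for long\n");
    else if (*end != '\0')
        fprintf(stderr, "trailing junk after the number: \"%s\"\n", end);
    else
        printf("parsed %ld\n", value);
    return 0;
}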
If you don't specify otherwise, I believe a C compiler will just guess that undeclared functions take the form extern int foo(). Which is why atoi works and atof doesn't. Which compiler flags were you using? I suggest using -Wall to turn on a bunch of gcc warnings, which should include warnings about referencing undeclared functions.
C allows you to call a function without having a declaration for that function.
The function will be assumed to return an int and arguments will be passed using default promotions. If those don't match what the function actually expects, you'll get undefined behavior.
Compilers will often warn for this case, but not always (and that will also depend on compiler configuration).
In C, when you use a function that was not declared, the compiler assumes that it has the default declaration:
int FUNCTION_NAME();
Note that in C an empty parameter list () in a declaration means the function accepts an unspecified number of arguments.
If you compile with the flag -Wall (I recommend always using this flag, since it enables a broad set of recommended warnings) you will get a warning (not an error) telling you that you are using an undeclared function.
C, unfortunately, does not require functions to be prototyped (or even declared) before use -- but without a prototype, it automatically makes certain assumptions about the function. One of those is that it returns an int. In your case, atoi does return an int, so it works correctly. atof doesn't, so it doesn't work correctly. Lacking a prototype/declaration, you get undefined behavior -- typically it'll end up retrieving whatever value happens to be in the register where an int would normally be returned, and using that. It appears that in your particular case, that happens to be a zero, but it could just as easily be something else.
This is one of the reasons many people push "C++ as a better C" -- C++ does require that all functions be declared before use, and further that you specify the types of all (non-variadic) parameters as well (i.e. a C++ function declaration is like a C prototype, not like a C declaration).
Is including a header file essential? This code:
main()
{
    int i = 100;
    printf("%d\n", i);
}
seems to work; the output I get is 100, even without including the stdio.h header file. How is this possible?
You don't have to include the header file. Its purpose is to let the compiler know all the information about stdio, but it's by no means necessary if your compiler is smart (or lazy).
You should include it because it's a good habit to get into - if you don't, then the compiler has no real way to know if you're breaking the rules, such as with:
int main (void) {
    puts (7); // should be a string.
    return 0;
}
which compiles without issue but rightly dumps core when running. Changing it to:
#include <stdio.h>
int main (void) {
    puts (7);
    return 0;
}
will result in the compiler warning you with something like:
qq.c:3: warning: passing argument 1 of ‘puts’ makes pointer
from integer without a cast
A decent compiler may warn you about this anyway, since gcc knows what printf is supposed to look like even without the header:
qq.c:7: warning: incompatible implicit declaration of
built-in function ‘printf’
How is this possible? In short: three pieces of luck.
This is possible because some compilers will make assumptions about undeclared functions. Specifically, parameters are assumed to be int, and the return type is also assumed to be int. Since an int is often the same size as a char* (depending on the architecture), you can get away with passing ints and strings, as a parameter of the correct size will get pushed onto the stack.
In your example, since printf was not declared, it was assumed to take two int parameters, and you passed a char* and an int which is "compatible" in terms of the invocation. So the compiler shrugged and generated some code that should have been about right. (It really should have warned you about an undeclared function.)
So the first piece of luck was that the compiler's assumption was compatible with the real function.
Then at the linker stage, because printf is part of the C Standard Library, the compiler/linker will automatically include this in the link stage. Since the printf symbol was indeed in the C stdlib, the linker resolved the symbol and all was well. The linking was the second piece of luck, as a function anywhere other than the standard library will need its library linked in also.
Finally, at runtime we see your third piece of luck. The compiler made a blind assumption, the symbol happened to be linked in by default. But - at runtime you could have easily passed data in such a way as to crash your app. Fortunately the parameters matched up, and the right thing ended up occurring. This will certainly not always be the case, and I daresay the above would have probably failed on a 64-bit system.
So - to answer the original question, it really is essential to include header files, because if it works, it is only through blind luck!
As paxidiablo said, it's not necessary, but this is only true for functions and variables. If your header file provides types or macros (#define) that you use, then you must include it, because those are needed before linking happens, i.e. during preprocessing and compilation.
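To illustrate the difference: a function call can still be resolved by the linker even if the compiler never saw a declaration, but a macro such as EXIT_SUCCESS must be expanded by the preprocessor, so without its header the name simply does not exist. A small sketch:
#include <stdio.h>
#include <stdlib.h>   /* without this, EXIT_SUCCESS is an undeclared identifier and
                         compilation fails; the linker never even gets involved */

int main(void) {
    printf("done\n");
    return EXIT_SUCCESS;   /* expanded by the preprocessor (typically to 0) */
}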
This is possible because when the C compiler sees a call to an undeclared function (printf() in your case) it assumes that it has the signature
int printf()
and calls it, passing the arguments after the default promotions. Since "int" and "void *" often have the same size, this works most of the time. But it is not wise to rely on such behavior.
C supports three forms of function arguments (sketched in code after the list):
Known fixed arguments: this is when you declare the function with its arguments: foo(int x, double y).
Unknown fixed arguments: this is when you declare it with empty parentheses: foo() (not to be confused with foo(void), which is the first form with no arguments), or do not declare it at all.
Variable arguments: this is when you declare it with an ellipsis: foo(int x, ...).
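For concreteness, the three forms might look like this as declarations (the foo_* names are hypothetical):
/* Form 1: known fixed arguments - a full prototype */
double foo_fixed(int x, double y);

/* Form 1 with no arguments at all */
double foo_void(void);

/* Form 2: unknown fixed arguments - () says nothing about the parameters */
double foo_unknown();

/* Form 3: variable arguments - an ellipsis after at least one named parameter */
double foo_varargs(int x, ...);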
When you see a standard function working like this, it is because its definition (which is in form 1 or 3) happens to be compatible with form 2 (it uses the same calling convention). Many old standard library functions are like that by design, because they date from early versions of C, which had no function declarations, so every call was effectively in form 2. Other functions may be unintentionally compatible with form 2, if their parameters happen to match what the default argument promotions produce; but some are not.
Form 2 also requires the programmer to pass arguments of the same types everywhere, because the compiler cannot check the arguments against a prototype and has to determine the calling convention from the actual arguments passed.
For example, on an MC68000 machine the first two integer arguments to fixed-argument functions (both forms 1 and 2) are passed in registers D0 and D1, the first two pointers in A0 and A1, and everything else on the stack. So, for example, the function fwrite(const void *ptr, size_t size, size_t count, FILE *stream); gets its arguments as: ptr in A0, size in D0, count in D1 and stream in A1 (and returns its result in D0). When you include stdio.h this is what happens, whatever you pass to it.
When you do not include stdio.h something else happens. If you call fwrite as fwrite(data, sizeof(*data), 5, myfile), the compiler looks at the arguments and sees the function being called as fwrite(pointer, int, int, pointer). So what does it do? It passes the first pointer in A0, the first int in D0, the second int in D1 and the second pointer in A1, which is exactly what is needed.
But if you call it as fwrite(data, sizeof(*data), 5.0, myfile), with the count being of type double, the compiler will try to pass the count on the stack, since it is not an integer, while the function expects it in D1. Now D1 contains garbage rather than the count, so further behaviour is unpredictable. But when you use the prototype defined in stdio.h, everything is fine: the compiler automatically converts the argument to the right integer type and passes it as needed. This is not an abstract example, because a double argument may simply be the result of a computation involving floating-point numbers, and you may miss that while assuming the result is an int.
Another example is a variable-argument function (form 3) like printf(char *fmt, ...). For these, the calling convention requires the last named argument (fmt here) to be passed on the stack regardless of its type. So when you call printf("%d", 10), the compiler puts the pointer to "%d" and the number 10 on the stack and calls the function as needed.
But when you do not include stdio.h, the compiler does not know that printf is a vararg function and supposes that printf("%d", 10) is a call to a function with fixed arguments of type pointer and int. So the MC68000 will place the pointer in A0 and the int in D0 instead of on the stack, and the result is again unpredictable.
You may get lucky: the arguments may happen to be on the stack already and get read from there, so you get the correct result... this time... but the next time it will fail. Another kind of luck is a compiler that allows for an undeclared function possibly being vararg (and somehow makes the call compatible with both forms). Or all arguments in all forms may simply be passed on the stack on your machine, so the fixed, unknown and vararg forms are all called identically.
So: do not do this, even if you feel lucky and it works. The unknown-fixed-arguments form is there only for compatibility with old code, and its use is strongly discouraged.
Also note: C++ will not allow this at all, as it requires functions to be declared with known argument types.