Have the compiler warn if a function is used (C)

I have C code with lots of calls to strcmp and strcpy that are causing all kinds of problems.
I want to migrate this to strncmp and strncpy, but I cannot update all the code right now. I want to add a compiler warning wherever these functions are used.
The following forces the substitution, #define strcmp(x,y) strncmp16(x,y,64), but the problem is still in the code.
Is there a way to add a #warning so that the code still compiles but gives a warning for not using the sized functions?
It is a large code base and must compile with four different compilers (GCC, IAR, GHS and VC). It is our own C library, mainly used in embedded systems.
Edit: I am not looking to find all occurrences. There are thousands of tools that can be used to find and replace them. I want there to be a warning so that the next time somebody looks at the code they will evaluate and fix it.
Edit: strncmp and strncpy have lots of issues and I am very aware of that. I am making an informed decision. These functions are in our own C library, not just the default functions from the compiler's C library.

While you can use #define to force errors, there is no mechanism in the C99 standard (and none in C11 either) to force a warning.
If you are using GCC, you can mark a prototype as deprecated with the deprecated function attribute, e.g.:
int strcmp(const char *, const char *) __attribute__((deprecated));

For Visual Studio:
prefix the function prototype with __declspec(deprecated) as seen in MSDN
You'll need to raise the warning level to 3+.
Example:
#pragma deprecated(strcpy, strcmp)
This line will cause every call to either function to emit a C4995 warning.
These specific functions already emit a C4996 warning, but you can turn that warning off via a pragma:
#pragma warning(disable: 4996)

Related

Why doesn't gcc require I `#include stdio.h` in my `helloworld.c` [duplicate]

What is meant by the term "implicit declaration of a function"? A call to a standard library function without including the appropriate header file produces a warning as in the case of:
int main(){
    printf("How is this not an error?");
    return 0;
}
Shouldn't using a function without declaring it be an error? Please explain in detail. I searched this site and found similar questions, but could not find a definitive answer. Most answers said something about including the header file to get rid of the warning, but I want to know how this is not an error.
It should be considered an error. But C is an ancient language, so it's only a warning.
Compiling with -Werror (GCC) fixes this problem.
When C doesn't find a declaration, it assumes this implicit declaration: int f();, which means the function can receive whatever you give it, and returns an integer. If this happens to be close enough (and in case of printf, it is), then things can work. In some cases (e.g., the function actually returns a pointer, and pointers are larger than ints), it may cause real trouble.
Note that this was changed in newer C standards (C99 and C11): there, an implicit declaration is a constraint violation, so the compiler must issue a diagnostic. However, GCC traditionally compiles such code anyway, so by default you may still get only a warning.
Implicit declarations are not valid in C.
C99 removed this feature (present in C89).
GCC chooses to only issue a warning by default with -std=c99, but a compiler has the right to refuse to translate such a program.
To complete the picture: since -Werror might be considered too "invasive",
a more precise solution for GCC (and Clang) is to turn just this warning into an error, using the option:
-Werror=implicit-function-declaration
See How can I make this GCC warning an error?.
Regarding general use of -Werror: of course, having warning-free code is recommendable, but at some stages of development it might slow down prototyping.
For historical reasons going back to the very first version of C, a call without a prototype passes whatever type the argument happens to be, so it could be an int or a double or a char*. Without a prototype, the compiler will pass whatever size the argument is, and the called function had better use the correct argument type to receive it.
For more details, look up K&R C.
An implicitly declared function is one that has neither a prototype nor a definition, but is called somewhere in the code. Because of that, the compiler cannot verify that this is the intended usage of the function (whether the count and the type of the arguments match). Resolving the references to it is done after compilation, at link-time (as with all other global symbols), so technically it is not a problem to skip the prototype.
It is assumed that the programmer knows what they are doing, and this is the premise under which the formal contract of providing a prototype is omitted.
Nasty bugs can happen if the function is called with arguments of the wrong type or count. The most likely manifestation of this is a corruption of the stack.
Nowadays this feature might seem like an obscure oddity, but in the old days it was a way to reduce the number of header files included, and hence to compile faster.
C is a very low-level language, so it permits you to create almost any legal object (.o) file that you can conceive of. You should think of C as basically dressed-up assembly language.
In particular, C does not require functions to be declared before they are used. If you call a function without declaring it, the use of the function becomes its (implicit) declaration. In a simple test I just ran, this is only a warning in the case of built-in library functions like printf (at least in GCC), but for random functions, it will compile just fine.
Of course, when you try to link, and it can't find foo, then you will get an error.
In the case of library functions like printf, some compilers contain built-in declarations for them so they can do some basic type checking, so when the implicit declaration (from the use) doesn't match the built-in declaration, you'll get a warning.

Why do we include stdlib.h?

The C function malloc() is declared in stdlib.h.
It should give an error if we don't include this header, but this code works fine, with only a small warning.
My question is: if malloc() works without this header file, then why do we need to include it? Please help clarify my concepts.
#include <stdio.h>

int main()
{
    int a, b, *p;
    p = (int*)malloc(sizeof(int)*5);
    for (a = 0; a < 5; a++) p[a] = a*9;
    for (b = 0; b < 5; b++) printf("%d ", p[b]);
}
In C, unfortunately, you don't need a pre-declaration for functions. If the compiler encounters a call to an unknown function, it creates an implicit declaration for it ("mmm'kay, this is how it is used, so I will assume that the types of the arguments are...").
Do not rely on this "feature", and in general do not write code that compiles with warnings.
Read the warning. It says the code is invalid; the compiler is simply being too kind to you. This works in Clang, but it might not in other compilers.
At least include the header to suppress the warning. Unnecessary warnings are annoying. Any program should compile with warnings treated as errors (I always enable that).
It appears that this is your compiler's magic. Not including the necessary headers may work on your compiler (which I suppose is by Microsoft), but it won't necessarily compile elsewhere (that includes future versions of the same compiler). Write standard-conforming, portable code.
stdlib.h is the general-purpose standard header; it declares the dynamic memory allocation functions and other standard utilities.
For example, if you want to pause at the end of your program so the user has time to read the displayed information, you might call the getch() function, which reads a character from the keyboard.
Note, though, that getch() is not a standard function at all: it is declared in the non-standard conio.h header, not in stdlib.h.
Like many things in C, the reason an error isn't generated when there is no prototype is historical. In the early days people often didn't bother prototyping functions, because pointers and integers were usually the same size, integral types smaller than an integer were promoted to an integer when passed as a parameter, and floating point was rarely used for systems programming.
If at any point the compiler had been changed to give an error when a function was not prototyped, it would have broken many programs and would not have gained widespread acceptance.
With 64-bit addressing we are now entering a period where integers and pointers are not the same size, and programs will most likely break if you do not prototype functions like malloc() that return a pointer.
In gcc, always set the following options for your own programs: -Werror -Wstrict-prototypes

Is there a gcc command line option to silence warning: passing argument n discards qualifiers from type

I'm trying to compile with -Wall -Werror and it's cramping my style.
I'm trying to make it explicit that certain arguments are constants, and then passing them to functions inside a large library that don't const-qualify their parameters.
P.S. I was mostly doing this to make it clear that certain variables are constants. Is it good or bad C style to do this when dealing with library functions that don't use const?
If you are passing those constants into routines as reference parameters or by pointer, then there may be a damn good reason for those warnings. How do you know that those routines won't modify your "constants"? What is that gonna screw up in the rest of your code, which you told that those variables won't ever change?
If you really know for sure that what you are doing is safe, and there is no good way to recode things to get rid of the warning, you can turn some warnings off in gcc using pragmas. Do this for as small an area of code as possible, and comment why you are doing it.
Do not abuse this privilege, or you are liable to be arrested by the code police and sentenced to 9 months of community service coding in Ada. That'll cure you of ever complaining about C's warnings again.
Use the -Wno-discarded-qualifiers switch. (-Wignored-qualifiers is a different GCC warning, about qualifiers on function return types; the "passing argument discards qualifiers" diagnostic is controlled by -Wdiscarded-qualifiers.)
Sometimes, when compiling with -Wall -Wextra -Werror (as I do too because it is very good practice), you face recurring warnings that you may want to disable project wide, or on a per source file basis. One that I disable often in my projects for instance is -Wno-long-long. This is not bad practice, because you know what you are doing, and you don't want to control third party code.
As I understand though, you are trying to disable the warning for specific parts of the code, since otherwise it would ruin your effort putting const everywhere. In this case, do:
#pragma GCC diagnostic push
#pragma GCC diagnostic ignored "-Wdiscarded-qualifiers"
OffendingThirdPartyFunction(MyConstParam);
#pragma GCC diagnostic pop
or, wrapped in a macro (note that the semicolon must live inside the macro body, because a _Pragma cannot appear in the middle of a statement):
#define NO_WARNING(expr) \
    do { \
        _Pragma("GCC diagnostic push") \
        _Pragma("GCC diagnostic ignored \"-Wdiscarded-qualifiers\"") \
        expr; \
        _Pragma("GCC diagnostic pop") \
    } while (0)
NO_WARNING(OffendingThirdPartyFunction(MyConstParam));
Alternatively, you can use a cast. This is by far the most portable solution.
OffendingThirdPartyFunction((param_t*)MyConstParam);
Don't use a command-line option: the warning tells you that your code is not const-safe. It's right, your code isn't const-safe, although that's the fault of whoever wrote the library you're using.
If you disable the warnings then you won't get them any more even if you write code that's unsafe and it is your fault.
Instead, whenever one of the library functions takes a pointer-to-non-const, but guarantees not to modify the referand of the pointer, then cast the const-ness away from your pointer, and pass that. The cast serves as a record in the code that you (the programmer) are claiming that nothing invalid will happen, even though the type system can't prove that, and should probably be commented.
For example:
// in this library
void print_int(int *p) { printf("%d\n", *p); }
void set_int(int *p) { *p = 6; }
// your code
const int n = 5;
print_int((int*)(&n)); // warning suppressed for this call only,
// having carefully checked the print_int docs.
// further down your code
set_int(&n); // You *want* the compiler to stop this!
Or if you can be bothered (because you have a lot of such calls), write wrappers for the offending library functions:
void print_int_const(const int *p) { print_int((int*)(p)); }
// but no wrapper for set_int
Be aware that the cast also removes volatile (and in general accepts a lot of incorrect inputs). The overload prevents you accidentally using completely the wrong type, while the in-place cast doesn't.

Is it necessary to write these headers in C?

I want a list of header files in C which are not necessary in order to use the functions they declare.
Example:
scanf(), printf(), ... etc.  // can be used without stdio.h
getch() ... etc.             // can be used without conio.h
Is it necessary to write these headers (stdio.h, conio.h) while I use these (above) functions?
Using functions without prototypes is an obsolescent feature in the current standard, C99, and will probably be removed in a future version. And this is for a very good reason: such usage is much more error-prone and leads to hard-to-track faults. Don't do that.
In the current C language standard, function declarations (but not prototypes) are mandatory, and prototypes have always been mandatory for variadic functions like printf. However, you are not required to include the headers; you're free to declare/prototype the functions yourself as long as you have the required types available. For example, with printf, you could do:
int printf(const char *, ...);
printf("%d\n", 1);
But with snprintf, you would need at least stddef.h to get size_t:
#include <stddef.h>
int snprintf(char *, size_t, const char *, ...);
And with non-variadic functions, a non-prototype declaration is valid:
int atoi();
Depends on your compiler.
For GCC: http://gcc.gnu.org/viewcvs/trunk/gcc/builtins.def?view=markup
Basically, you can use any C function without including its header. The header contains the prototypes for these functions, making it possible to warn about a wrong type or number of parameters.
So yes, you can do without these headers.
But no, you shouldn't do this.
Normally you don't write these headers yourself anyway; they are provided by the build environment.
Another thing: since these checks are only warnings in C, you should switch the warnings on and treat them like errors. Otherwise you are in for a very bad C experience.
In gcc you should always run with the options -W -Wall and avoid all the warnings these give.
And BTW, these are not methods but functions.
Addendum: since you are going to treat all warnings as errors, you might turn on -Werror, which turns all warnings into errors, enforcing exactly this.
Personally I don't use this option, but I have the discipline to clean out all warnings in the end. This makes it possible to ignore the warnings for a while and then do a clean-up before committing to version control.
But certainly for groups it makes sense to enforce this with -Werror, e.g. in test scripts run before allowing commits.

Code still runs without any includes (Bloodshed's Dev-C++)

I am currently learning and experimenting with C and am using Bloodshed's DEV-C++ as an IDE.
Now, I just realized that the following piece of code (as it is, no includes or anything) compiles and runs:
main ()
{
    printf("%d", strlen("hello"));
}
Now, if I'm not mistaken, shouldn't two header files be included in this source for it to work, stdio.h and string.h? But as you can see, I did not add them and the code still compiled and ran successfully.
My complaint is that I want the compiler to be "strict": since I'm still learning C, I don't want the code to run if it normally shouldn't.
So, is there any way to prevent Dev-C++ from 'correcting my mistakes' when it comes to includes, i.e. making it more "strict"?
C90 had a feature (absent from C99 and C++) called implicit function declaration: when you used a name not yet declared in a function call, the compiler behaved as if
extern int identifier();
had been seen. That feature was dropped from C99, and most compilers had an option to warn about it even before C99 was promulgated.
Even when staying in C90, using this is not recommended style. If you have to maintain code making use of it and can't add prototypes, check that:

- the function returns an int (this is the case for printf, but for strlen the validity is implementation-dependent: it returns a size_t, which may be int or something else);
- the function isn't variadic (true for strlen but not for printf);
- the types of the arguments are not modified by the default argument promotions (char, short and float are); you must also take care to cast pointers to void * when the expected type is void *, and to cast NULL to the correct pointer type (these are the same things you have to watch for with variadic arguments, BTW).

If those conditions aren't met, and they aren't for either call in your code, you enter the realm of undefined behavior.
I don't know if this actually is a Dev-C++ issue, but in any case you should consider ditching it. It is no longer being developed and is very buggy. I recommend changing to Code::Blocks, which is better in every way and allows you to use the very latest GCC compiler.
One of the possibilities for 'undefined behaviour' - which you get if you call a variadic function without a visible prototype - is that your code compiles and runs successfully.
If you're using gcc as the underlying compiler then you should be able to pass flags such as -std=c89 -pedantic -Wall -Wextra and get warnings about code such as the snippet that you've posted.
