Intel compilers: silence command-line warnings - C

I just started building some code with the Intel C compiler, icc. Our configure script likes to add the -ffast-math flag and maybe a couple of others that seem to be GCC-specific. Invoking icc with -ffast-math produces the following warning, which I would like to silence:
icc: command line warning #10006: ignoring unknown option '-ffast-math'
As I see it, there are two ways it could be silenced (but I'd love to see other solutions). First, I could turn that warning into an error, which would tell configure that -ffast-math isn't a valid option. I would hope that when configure tries to add it to the command line, it would see that it can't and decide that adding it was a bad idea after all. The second option (which I don't think is quite as clean) is simply to tell icc to suppress that kind of warning.
Responding to the comments, here's the relevant portion of configure.ac:
# add -ffast-math etc. if possible
AX_CHECK_COMPILER_FLAGS([-ffast-math],
    [CFLAGS="$CFLAGS -ffast-math"])
AX_CHECK_COMPILER_FLAGS([-mtune=native -march=native],
    [CFLAGS="$CFLAGS -mtune=native -march=native"])
That m4 macro appears to have been taken from here
I suppose that fixing that macro to be smarter would be the holy grail. But since icc returns a successful exit status even when -ffast-math is passed (or -mtune=native, etc.), I don't think there is much that can be done there (feel free to prove me wrong). That said, I don't want to hard-code checks for Intel into the configure script; that seems overly messy.
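One compiler-agnostic possibility, sketched below but not tested against a real icc install (so treat the diagnostic number and flag spelling as assumptions to verify): probe for icc's own diagnostic-control option with the same macro the script already uses. icc accepts -diag-disable 10006 (older versions spell it -wd10006) to suppress that exact warning, while gcc rejects -diag-disable as an unknown option, so the check fails there and nothing gets added:

```m4
# Silence icc's "ignoring unknown option" warning (#10006), but only
# if the compiler actually understands the icc-specific flag; gcc
# rejects -diag-disable outright, so this is a no-op for gcc.
AX_CHECK_COMPILER_FLAGS([-diag-disable 10006],
    [CFLAGS="$CFLAGS -diag-disable 10006"])
```

This doesn't make the -ffast-math check itself smarter, but it keeps the build log clean without hard-coding an Intel test into configure.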

Related

Why does optimizing with the -O(1/2/3) option interfere with scanf()?

I am trying to compile a C program using gcc with optimization enabled.
gcc [name] -lm
works fine, but as soon as I add -O3 it gives me this warning:
warning: ignoring return value of ‘scanf’, declared with attribute warn_unused_result [-Wunused-result]
What am I missing here?
Why does this warning depend on the optimization level? [from comment section]
The main reason is that certain analyses are only enabled at the higher optimization levels.
The absolute best way is to do as the compiler tells you. Not checking the return value of various functions is something that is the source of many bugs. Just do something like this:
if (scanf("%d%d", &x, &y) != 2) { /* handle error */ }
and the warning will disappear.
If you instead decide that you know better than the compiler and simply want it to shut up, then you can also use the name the warning gives you, even if it's not so obvious how. Use this pragma:
#pragma GCC diagnostic ignored "-Wunused-result"
That method can be used for a lot of warnings. In this particular case, it's however more conventional to cast the result:
(void)scanf( ...
but that does not always work.
Do note that all of these have their pros and cons. If, for instance, you are for some reason able to prove that all input from stdin will have the right form and that you will never fail to read from it, then by all means deactivate the warning. Another case is when you have special needs, such as extreme performance or very limited resources; then it could be a wise choice to do a risk analysis and see whether the risk is acceptable.
But as a general rule, do not deactivate compiler warnings unless you are very, very certain that the thing the compiler is warning about will not cause trouble. "I don't want to see a lot of warnings" is not a good reason for disabling warnings.
This happens when you compile it with -O2 or -O3 optimization level.
You can either suppress the warning by using the following gcc command options
gcc -Wall -Wextra -Wno-unused-result file.c -o file
or you can declare a variable (if there are no memory constraints) to store the return value of scanf(), which avoids the warning.
int unused __attribute__((unused));
unused = scanf("%d",&e);

What's the warning under -fgcse about invoking -O2 on programs that use computed gotos?

I've written a program using computed gotos (yes, I know) and would like to know whether I can compile it with -O2. However, GCC's documentation is rather unhelpful:
Please note the warning under -fgcse about invoking -O2 on programs that use computed gotos.
This message is profoundly unhelpful, and searching for similar terms just turns up that same documentation page.
What is this warning? What does it mean for my code?

GCC optimization flag problems

I am having a problem with some C code built with gcc compiler. The code in question has an enum, whose values are used as cases in a switch statement to configure an object. Fairly standard stuff.
When I compile the code with the -O0 flag, everything builds and runs correctly, no problem. However, when the flag is set to -O2, the code no longer works as expected.
When I step through the code and put a watch on the local variables, the enum, which should only be one of three enum values, is actually -104! This causes the program to fail to configure the object.
Has anyone encountered this before and could provide some guidance? I haven't seen this before and would appreciate it if someone could explain why the compiler does this, so I can make any necessary changes.
Snippet of code in question:
value = 0u;
switch (test_config) {
case DISABLE:
    break;
case INTERNAL:
    value = 1u;
    break;
case EXTERNAL:
    value = 2u;
    break;
default:
    valid = FALSE;
    break;
}
if (valid) {
    configure_test(value);
}
Enum in question:
typedef enum {
    DISABLE,
    INTERNAL,
    EXTERNAL
} test_config_t;
This is the code that is causing the problem. I initially didn't include it because I didn't want the question to be "please fix my code"; rather, I have been googling for reasons why gcc optimisation flags would produce different results for the same piece of code and haven't found anything particularly helpful. Also, I am not at my computer and had to type this on my phone, which doesn't help either. So I came here, because there are experts here who know way more than me and could point me in the right direction.
Some more info that I probably should have included: the code runs on hardware, which might also be the problem, and I am looking into that as well. When run from the FSBL the code works with -O0, but not with -O2. So it may be hardware, but then I don't know why it works one way and not the other.
You don't give enough details (your question should include an MCVE), but you very probably have some undefined behavior, and you should be scared.
Remember that C11 or C99 (like most programming languages) is defined by an explicit specification written in English, which partly defines the runtime behaviour of a valid C program (not only by the concrete behaviour observed for your code). Read n1570.
I strongly recommend reading Lattner's blog What Every C programmer should know about Undefined Behavior before even touching or compiling your source code.
I recommend at least compiling with (nearly) all warnings and debug info, e.g. with gcc -Wall -Wextra -g, then improve the code to get no warnings, and run it under the gdb debugger and valgrind. Read more about Invoking GCC. You may also use (temporarily) some sanitizer instrumentation options, notably -fsanitize=undefined and -fsanitize=address. You could also add -std=gnu99 and -pedantic to your compiler flags. Notice that gdb watchpoints are a very useful debugger feature to find why a value has changed or is unexpected.
When you compile for release or for benchmarking with optimizations enabled, keep also the warning flags (so compile with gcc -O2 -Wall -Wextra); optimizations might give extra warnings which you should also correct. BTW, GCC accepts both -O2 and -g at the same time.
When you observe such issues, question first your own code before suspecting the compiler (because compilers are very well tested; I found only one compiler bug in almost 40 years of programming).

How to make gcc 4.7 warn about use of the infamous gets() function?

I saw yet another question about C where the code was using gets(),
and I commented with the usual warning about never using gets() except
when you want to demonstrate how to break security.
This time, I decided to check if my compiler issued a warning about the
use of gets(). Of course I expected it would. Has to, right? Even if
you don’t specify any warnings?
Imagine my surprise when I found that, not only does the compiler not
warn by default, but I couldn’t even figure out how to make it warn!
The compiler in question is gcc 4.7.2 on Debian, and here’s the code I
used:
#include <stdio.h>

int main(void)
{
    char s[10];
    gets(s);
    puts(s);
    return 0;
}
Tried gcc g.c. Compiles with no warnings. Runs. Gets a segfault if
you enter too much text.
Tried with all the standard warnings I normally put in a makefile:
gcc -W -Wall -Wno-long-long -Wshadow -Wlarger-than-1000 \
-Wpointer-arith -Wbad-function-cast -Wcast-qual -Wcast-align \
-Wconversion -Waggregate-return -Wmissing-prototypes \
-Wmissing-declarations -Wpadded -Wredundant-decls -Wnested-externs g.c
Same result.
Tried with -std=c11. Even that didn’t generate a warning, which is
pretty weird, considering gets() doesn’t even exist in C11. Tried
c99 too. No warning.
So what’s going on here? Why doesn’t this very widely used compiler warn
me when I use the most deprecated function in the entire C language?
EDIT: Acting on Keith Thompson’s suggestion below, I checked for a
deprecation attribute in stdio.h. It was not there. I then copied the
header file and experimented. Adding either of these strings (which I
found in other headers) to the end of the declaration did generate a
warning:
__attribute_deprecated__
__attribute__ ((__deprecated__))
The warning:
‘gets’ is deprecated (declared at /usr/include/stdio.h:632) [-Wdeprecated-declarations]
To summarize the responses I’ve seen so far, it appears that the version
of libc on my system doesn’t include the warning, which does exist in
later versions. This is strange, since the warning has existed in some
form since at least 1996. I vaguely recall that libc has been forked at
least once, so perhaps the warning was left out of one branch
significantly later than in other branches.
I think I will ask on the Debian mailing lists about this, and perhaps
report it as a bug, depending on what I learn.
EDIT 2: I’ve looked at some source code. glibc has had the warning
since 2007 at least, in libio/iogets.c. eglibc 2.13, the one I have,
has exactly the same warning code:
#ifdef _LIBC
link_warning (gets, "the `gets' function is dangerous and should not be used.")
#endif
I suppose _LIBC wasn’t defined when the library was compiled. Why, I
don’t know. I’m not sure what the purpose of _LIBC is.
So, the answer seems to come down to “It’s the library, and for whatever
reason, in their wisdom, the Debian developer responsible for it
compiled it that way.” We may never know why.
Not going to report it as a bug, since I’m using oldstable. Might bring
it up if it’s still that way after my next upgrade.
Thanks, everyone, for your informative responses!
It's not GCC that issues this warning message; it's glibc.
It's unlikely that you're using a version of glibc that is too old: the warning has been around since at least 1996. See line 67 of this glibc code on GitHub for an example (note the date: 15 Dec 1996):
link_warning (gets, "the `gets' function is dangerous and should not be used.")
Most likely you're using a different C library.

Disadvantages of using the `-Wextra` flag when compiling in GCC

I know that one should always compile with both -Wall and -Wextra, as they enable warnings and help us catch our mistakes, if any.
I've read that the -Wextra compiler flag is not recommended because it is too verbose, with a lot of false positives.
I was quite surprised to read this, so I started googling about it, but I didn't get any answer: all the search results only explained what the -Wextra flag does.
So, my questions are:
In which situations does the -Wextra flag emit unnecessary warnings?
Is it possible to stop the -Wextra flag from enabling the other flags that cause GCC to emit these types of warnings?
The point about the usefulness of -Wextra warnings is that the corresponding warnings have these properties (more or less):
They generate false positives.
The false positives are relatively frequent.
Adapting the code to avoid false positives is a nuisance.
The consequence is, that many projects that switch on -Wall -Wextra give up trying to avoid all the warnings. Consequently, when you compile such a project, you see hundreds and hundreds of warnings, almost all about legit code. And the one warning that you should have seen slips unnoticed in the endless warning stream. Worse: the fact that a normal compilation spits out so many warnings makes you get sloppy about avoiding the new warnings that your new code introduces. Once a project reaches a few tens of warnings, the fight is usually over; it will require a heroic effort to bring the warning count back to zero.
Here is an example of some perfectly legitimate code that is barked at by -Wextra:
void myClass_destruct(MyClass* me) {}
It's a destructor and it's empty. Yet, it should be there simply to facilitate calling it at the appropriate points (subclass destructors), so that it is easy to add stuff to MyClass that needs cleanup. However, -Wextra will bark at the unused me parameter. This forces programmers to write code like this:
void myClass_destruct(MyClass* me) {
    (void)me; // shut up the compiler barking about the unused parameter
}
This is plain noise. And it gets annoying to write such code. However, in order to achieve a zero warning count under a -Wextra regime, this noise needs to be added to the code. So, you can pick any two of these three:
Clean code
Zero warning count
-Wextra warnings
Choose wisely which one you want to drop; you won't get all three.
Some of the -Wextra warnings enforce a coding style that can conflict with some people's processes. This is the only reason to avoid it. My solution in this case is to 'correct' the exact set using the -Wno-* syntax.
For example, -Wextra implies -Wempty-body, which warns you about empty bodies of some control structures like if, else and while. But if on the one hand you follow an 'iterative' implementation style, and on the other want a serious amount of warnings, this will be uncomfortable for you.
Here is an evolving code example:
void doSomething()
{
    if (validatePreConditions())
    {
        //TODO Handle it some time.
    }
    // ... Here you have real code.
}
If you have -Werror and want to start testing without the pre-condition check implemented, you are in trouble.
But here is a pitfall: what is acceptable for 'in-development' code has to be fixed by the time the code goes to production. So my option for such cases is to have different sets of warnings and different build profiles. For example, a 'developer' build is OK with empty bodies, but anything going to be merged into the main line MUST NOT have such gaps.
Another example is -Wignored-qualifiers, which is included in -Wextra. In some cases you know the compiler will ignore your const qualifier, but you're adding it just for your own comfort (I personally don't support these qualifiers on return types, but some people think they do a good job with such marks).
In which situations does the -Wextra flag emit unnecessary warnings?
What is your definition of unnecessary? You can consult the manual (this is for 4.9.2) to see exactly what the warning flags do. Then you can decide for yourself whether or not it's right for your code base.
Is it possible to stop the -Wextra flag from enabling the other flags that cause GCC to emit these types of warnings?
Stick -Wno- in front of a warning's name to disable it. Not all warnings have a warning flag, which means you may have to live with some of them. You can also generally find more information by browsing the GCC bug tracker, searching for certain warning flags and seeing if any discussions exist. If it's not in the documentation, it might be there. The developers might also explain their rationale for why things are the way they are.
To address the comment you linked to:
...or warnings about missing field initializers (when you're
intentionally initializing a structure with { 0 }, making the compiler
zero-initialize everything else automagically).
This argument is moot because it has already been fixed. For example, the following code produces no missing-field-initializers warning:
typedef struct { int a; int b; } S;

int main()
{
    S s = { 0 };
}
It's very simple: a useless warning is one that never points out an actual bug.
Of course this is undecidable; it's impossible to know beforehand whether a particular warning will ever 'save the bacon'.
All GCC warnings have at some time or other pointed out real bugs, so the general theme of the GCC messages is that -Wall warnings are likely to point at errors and are easy to suppress if they don't. This plays well with -Werror.
The -Wextra messages, OTOH, point at real errors less often, or may point out constructs that are common practice in some older code or are just difficult to rewrite in an alternative way. So sometimes they may indeed be "too verbose".
For new code you will normally want -Wextra on.
For old code it should be turned on if it's not "too verbose".
There are basically three types of warnings, and GCC is not very good about grouping them or making this clear:
Warnings that indicate something formally invalid at the source level that should not even compile, but that's allowed for whatever reason (usually compatibility with legacy code). These are mandated by the language standard.
Warnings indicating that your program will necessarily invoke undefined behavior if it reaches the point in the code where the warning occurs. Your program could still be valid if the code is unreachable, and in fact this will likely occur with things like:
int i = 1;
if (INT_MAX > 0x7fffffff)
    i <<= 31;
Warnings that are purely heuristic, and do not by themselves indicate any programming error or invalidity of your program, where the compiler is simply warning you that you may have intended something different from what you wrote.
-Wextra turns on a lot of "type 3" warnings.
Generally everyone agrees types 1 and 2 are valuable, but as mentioned above, even type 2 can have false positives. Type 3 generally has lots of false positives; some of them can be suppressed with harmless changes to the code, while others require actually making your code worse (or turning off the warning locally) to avoid them.
Here's where people will disagree. To some programmers, you just turn on -Wall -Wextra and locally work around or disable warnings where false positives turn up. Unless you do that, though, false positives create a lot of noise and potentially lower the signal to noise ratio, causing you not to notice the more-serious warnings by getting you used to seeing warnings when you compile.
There are (at least) two camps of C programmers: those who think the compiler may sometimes know better, and those who think "who is the compiler to tell me how to write my code? After all, I know what I'm doing."
Camp 1 enables warnings as much as they can, and uses lint and other code checkers to get even more hints about questionable or potentially broken code.
Camp 2 doesn't like having their sloppiness pointed out all the time by a machine, let alone spending extra effort fixing what they think is perfect as-is.
Camp 1 is sought by the industry, because they produce less buggy code, fewer defects, and fewer customer bug reports. If you're programming launch vehicles, satellite equipment, or anything that needs safety or controls really expensive equipment, this camp's mindset is what is needed.
Camp 2 obviously gets away with it far too often, and the general software quality of many of the applications you use (on your computer, smart phone, appliances, ...) is an indication of this.
What camp do you want to belong to? The disadvantages of -Wextra depend on this answer. One man's disadvantage is another man's advantage.
