Assume I have code like:
int foo(int a) {
    int b;
    b = 0;
    b = a + 1;
    return b;
}
The assignment b = 0 is obviously redundant and can be removed. Can gcc catch such errors?
I know it would be optimized out, but I also want to see a warning (and clean it up).
I thought -Wunused-value would do the job, but a small test (with gcc version 11) proved me wrong. So which gcc option can help?
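For reference, -Wunused-value targets something narrower: an expression statement whose computed result is discarded, not a store that is later overwritten. A minimal sketch of the difference (illustrative only; the exact warning text varies by GCC version):

int foo(int a) {
    int b;
    b = 0;       /* dead store: -Wunused-value stays silent */
    a + 1;       /* result computed and discarded: -Wunused-value warns here */
    b = a + 1;
    return b;
}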
Related
In plenty of other instances, gcc seems to be able to detect that the condition at the beginning of the loop initializes the variable. It even works if I drop the increment operator. The warning also goes away without the -O2 flag. I've learned to trust that compiler warnings usually do mean something is wrong, so I'm wondering if there's something I'm missing or if it's just a weird compiler quirk.
void act(char **);

void test(int width, int height) {
    char *rowA[width];
    char **A;
    for (int i = 0; i < width * height; ++i) {
        if (!(i % width)) {
            if (i) {
                act(rowA);
            }
            rowA[0] = 0;
            A = rowA;
        }
        *A++ = "hello";
    }
}
Compiling on gcc-6.3.0 with -Wall -Werror -O2
Edit: to avoid confusion I've altered the code to be closer to the actual use case.
The warning means that the compiler couldn't prove the variable was always initialized before use.
This particular warning cannot be perfectly accurate, of course, and it errs on the side of caution: it is raised whenever the compiler cannot be sure. When using this warning it is common to run into this sort of situation, where you have to modify your correct code just to make the warning go away.
Possibly it is also evidence of an optimizer bug: the optimizer should realize that if (!(i % argc)) (from the pre-edit version of the code) can be optimized to if (i == 0).
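If you have convinced yourself the code is correct, the common workaround is to initialize the variable at its declaration so the compiler no longer has to prove anything. A minimal sketch of that approach applied to the code above (the extra initializer is purely to satisfy the analysis):

void act(char **);

void test(int width, int height) {
    char *rowA[width];
    char **A = rowA;   /* redundant: A is reassigned on the first iteration */

    for (int i = 0; i < width * height; ++i) {
        if (!(i % width)) {
            if (i) {
                act(rowA);
            }
            rowA[0] = 0;
            A = rowA;
        }
        *A++ = "hello";
    }
}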
Okay, so this is a stripped-down variant of a bug I had. The bug was that I initialized an array using a variable that was itself uninitialized. Earlier I had used a function to set the number of elements, but after a cleanup I forgot about it and moved all declarations to the top of the function.
I used the flags -std=c99 -Wall -Wextra -pedantic -O, and usually gcc warns about values being used before they are initialized, but in this specific case it didn't. So, my question is:
Is this a bug in gcc or is it possible for f(&n) to post-initialize the array size in some weird way?
#include <stdio.h>

void f(int *x) {
    *x = 8;
}

int main(void) {
    int n;
    float a[n];   // compiler should warn that n may contain garbage
    a[7] = 3.1415;
    printf("%f\n", a[7]);
    f(&n);        // removing this call makes the compiler warn as expected
    return 0;
}
EDIT: Could it be this gcc bug?
GCC is accepting float a[n] as a variable-length array. It should, however, warn you that n contains garbage when it's used. Perhaps the call to f() is being reordered above the VLA declaration in a way that makes the uninitialized use non-obvious to the code generator? Moving the call to f() above the declaration of a would clearly be wrong if n were initialized before use, but this program produces undefined behavior, so the compiler may do it anyway.
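Given that reading, the fix is simply to give n a value before the VLA declaration, for example by moving the call to f() up. A minimal sketch using the same names as in the question:

#include <stdio.h>

void f(int *x) {
    *x = 8;
}

int main(void) {
    int n;
    f(&n);        /* n must have a valid value... */
    float a[n];   /* ...before the VLA is sized with it */
    a[7] = 3.1415;
    printf("%f\n", a[7]);
    return 0;
}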
I am currently working on a project that includes a somewhat generic linked-list implementation using void pointers. While providing some utility functions for these lists, I decided to make the functions that identify elements take only (const void *).
After adding the const keyword where necessary, I thought about how correct my code now is (assuming I had implemented everything as it should be before).
As the compiler (GCC) didn't warn me, I decided to run a test.
I compiled the following code with "gcc -g -Wall test.c" and received no warnings whatsoever from GCC.
#include <stdio.h>
#include <stdlib.h>

void testf(const void *testp) {
    *((int32_t *) testp) += 1;
}

void testf2(const int32_t *testp) {
    *((int32_t *) testp) += 1;
}

int main() {
    int32_t testv = 0;
    printf("%i \n", testv);
    testf(&testv);
    printf("%i \n", testv);
    testf2(&testv);
    printf("%i \n", testv);
    return 0;
}
The output is the following:
0
1
2
I did not expect C to actually crash on this, but I did expect a warning from the compiler. In this example I'm only casting; in my real functions I'm also assigning the const void pointers to a tmp variable.
Is this a bug?
Given how sophisticated today's compilers are, I'd at least expect a warning that I'm casting a pointer to a non-const pointer. If I change the cast and add the const keyword there too, GCC throws the usual error that I'm trying to assign to a read-only location.
Am I supposed to rethink my trust in functions declaring const pointers? This is not what I understand as a contract :)
Is this a bug?
No, it's not. By casting you are saying "I know what I am doing" to the compiler.
But GCC does have the option -Wcast-qual, which warns whenever a cast removes a type qualifier such as const.
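For example, compiling the testf function from the question with -Wcast-qual added should produce a diagnostic along the lines of "cast discards 'const' qualifier from pointer target type" (exact wording varies by GCC version):

#include <stdint.h>

void testf(const void *testp) {
    *((int32_t *) testp) += 1;   /* -Wcast-qual flags this cast */
}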
I compiled the posted code using:
gcc -c -ggdb -Wall -Wextra -pedantic -std=c99 filename.c -o filename.o
(the following list is abbreviated for ease of reading)
error: 'int32_t' undeclared
error: expected expression before ')' token
*((int32_t *) testp) += 1;
warning: unused parameter 'testp'
void testf(const void *testp){
with lots more warnings and errors regarding int32_t
Surely your compiler stated these same items.
BTW: The missing header file is stdint.h.
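With that header added, the int32_t errors disappear (the const discussion above still applies). A minimal sketch of the corrected code:

#include <stdio.h>
#include <stdint.h>   /* declares int32_t and the other fixed-width types */

int main(void) {
    int32_t testv = 0;
    printf("%i \n", (int) testv);   /* cast keeps %i portable across platforms */
    return 0;
}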
The following test.c program
int main() {
    dummySum(1, 2);
    return 0;
}

int dummySum(int a, int b) {
    return a + b;
}
...doesn't generate any warning when compiled with gcc -o test test.c, whereas the following one does:
int main() {
    dummySum(1, 2);
    return 0;
}

void dummySum(int a, int b) {
    a + b;
}
Why?
When faced with an undeclared function, the compiler assumes a function that accepts the given number of arguments (I think) and returns int (that part I'm sure of). Your second definition returns void instead, and so you get the conflicting-types diagnostic.
I believe, based on a very quick scan of the foreword, that C99 (PDF link) removed implicit declarations. No great surprise that GCC still allows them (with a warning), though; I can't imagine how much code would start failing to compile otherwise...
I recommend using -Wall (turning on all warnings) so you get a huge amount of additional information; you can turn off specific warnings when you have a really good reason for whatever you're doing that generates them.
A function should be declared before it is used. When no declaration is visible, a C89 implementation assumes that the function:
takes an unspecified (but fixed) number of parameters
returns an int
This is called an implicit function declaration.
In C99, implicit declarations of functions have been removed from the language, and the implementation is free to refuse to translate the source code.
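Either way, the remedy is the same: declare the function before its first use. A minimal sketch using the example from the question:

int dummySum(int a, int b);   /* prototype visible before the call */

int main() {
    dummySum(1, 2);
    return 0;
}

int dummySum(int a, int b) {
    return a + b;
}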