MinGW compiler doesn't need function declarations?

I have these two files:
// first.c
int main(void) {
    putint(3);
}
and
// second.c
#include <stdio.h>
void putint(int n) {
    printf("%d", n);
    getchar();
}
When I run gcc 4.6.1 under Win XP:
gcc first.c second.c -o program.exe
It compiles without complaint and writes 3 to stdout. It doesn't need a putint declaration in first.c. How is this possible? Is this standard behavior?
I have tested this with MSVC 2008 Express, and as expected it builds only with the declaration in place:
// first.c
void putint(int);
int main(void) {
    putint(3);
}
Solved, thanks for the hints; these options helped to show the warning:
-Wimplicit
-std=c99 (MinGW 4.6 still uses gnu90 by default)
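With those flags the compiler points at the undeclared call. The exact wording and location vary by GCC version; this is just representative output:
gcc -std=c99 -Wimplicit first.c second.c -o program.exe
first.c: In function 'main':
first.c:3:5: warning: implicit declaration of function 'putint' [-Wimplicit-function-declaration]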

This is a legacy "feature" of C that should have stopped being used decades ago. You should use a compiler with settings that will warn you if you do something like this. GCC has several switches that you should specify when using it, and one of them will give you a warning for this.
Edit: I haven't been using gcc myself, but switches that you should check out are -pedantic, -Wall, -Wextra, and -std.
The compiler that accepts this is assuming, per the old language definition, that since you didn't see fit to tell it otherwise, the function a) returns an int, and b) expects exactly the arguments you passed (after the default promotions), so here a single int.
As @veer correctly points out, this should generally work in your particular case. In other cases, however, differences between those implicit assumptions and the function's actual signature would make things go boom.
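Here is a minimal sketch of the kind of mismatch meant above (file and function names are made up for illustration): the definition returns a double, but with no declaration in scope an old-style compiler assumes an int return, so the caller reads garbage.
// half.c -- the real definition
double half(int n) {
    return n / 2.0;
}
// caller.c -- no declaration of half() in scope
#include <stdio.h>
int main(void) {
    /* The compiler implicitly assumes: int half(int);
       The double actually returned is misread as an int,
       so the printed value is garbage (undefined behaviour). */
    printf("%d\n", half(7));
    return 0;
}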

This isn't specific to MinGW; all standard versions of gcc behave this way. As noted, this is legal in C89; gcc defaults to 'gnu89' (not c99), which also accepts the code without warning. If you switch to c99 or gnu99 (or later, such as c11), you'll get a warning by default, but the code will still compile.

As others have noted, this is standard behavior for conforming C compilers. Naming your files .c is part of what puts the compiler into C mode. You'll get fun things like "built-in functions" (printf() etc.) and all sorts of legacy C behaviors.
I'd like to add something to what others have said, based on what I experienced recently. MS expressly dropped support for C past C90, and their C90 support is poor, to say the least. I'm not entirely sure a standard ANSI C90 codebase would compile under newer versions of VS, because what you get is basically the C++ compiler with lots of features disabled (whereas GCC actually has a C compiler). They did this in order to promote C++. If you want to use real C, you can't really do it in MS Visual Studio, any edition, unless you are willing to declare all your variables at the start of functions, and so on.
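For what it's worth, the "declare everything at the top" restriction mentioned above looks like this in practice (a tiny sketch, not tied to any particular codebase):
void count_up(int n) {
    int i;                      /* C90: declarations must precede statements */
    for (i = 0; i < n; i++) {
        /* ... */
    }
    /* C99 and later would also allow: for (int i = 0; i < n; i++) ... */
}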

Related

Is it possible to compile C89 code on MS Windows?

I'm trying to work with some legacy C89 code, and am having trouble getting it to build. My usual environment is Visual Studio, but that only seems to support C99, and some C99 features (such as the stdio streams not necessarily being constant) break the code, badly. Before I start tampering with the code I want to write some tests, so I don't break the old behaviour, but I can't test the tests, so to speak, until I can get the code to build.
So is there still any way to compile C89 code on Windows?
Edit: Steve Summit has identified that the stdio streams being constant has never been guaranteed; it's just a feature of some compilers that my legacy code happens to depend on, in a rather deeply embedded way. So my question shifts to whether there is any C compiler (preferably free!) available for Windows that supports that assumption. Alternatively, I have an Ubuntu installation in a virtual machine, although I have little experience using it; is there such a compiler available on Ubuntu?
MSVC is a C++ compiler and has only gained C99 support recently. Previously it supported only C89, with some MS extensions. To compile in strict C89 mode, use the /Za option. Make sure to also pass /Tc to use C mode.
/Za, /Ze (Disable Language Extensions)
The /Za compiler option disables and emits errors for Microsoft extensions to C that aren't compatible with ANSI C89/ISO C90. The deprecated /Ze compiler option enables Microsoft extensions. Microsoft extensions are enabled by default.
See Enforce ANSI C Standard in Visual Studio 2015
Most other compilers use other options, like -ansi, -std=c90 or -std=iso9899:1990.
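For instance, assuming a source file named legacy.c (the file name is just a placeholder), the invocations would look roughly like this:
cl /Za /Tc legacy.c
gcc -std=c90 -pedantic legacy.c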
However, if this is just about stdin/stdout not being constant when used in a static initializer, then it's completely irrelevant to C89 and is actually an XY problem. The following snippet compiles without problem in VS2019 in C++ mode, so if you don't have any possible conflicts, just compile the code in C++ mode:
#include <stdio.h>
FILE* ifp = stdout;
int main()
{
    fprintf(ifp, "test\n");
    return 0;
}
Otherwise it's easy to fix this so it compiles in C mode, by moving the initialization into main():
#include <stdio.h>
FILE* ifp = NULL;
int main()
{
    ifp = stdout;
    fprintf(ifp, "test\n");
    return 0;
}
[This isn't really an answer, but it's too elaborate for a comment.]
If you've got code that does things like
#include <stdio.h>
FILE *ifp = stdin;
int main() { ... }
and if the problem you're having is errors stating that stdin is not a compile-time constant suitable for a static initializer, I think you're going to have to rewrite that aspect of your code. I could be wrong, but if I remember correctly, the idea that stdin et al. were compile-time constants was never a guarantee, just a useful property of the earliest Unix implementations. It wasn't necessarily true of all old implementations, so the "change" to the Standard that explicitly said they weren't necessarily constant wasn't a change per se, but rather, more or less a codification of the divergence of existing practice.
(In other words, if you've got a compiler that's rejecting the code, and even if it has a backwards-compatibility mode, I'd be surprised if the backwards-compatibility mode turned stdin into a compile-time constant.)
All supported (and even older) versions of Visual Studio are perfectly capable of compiling C89 code. Also, C99 is backward compatible with previous revisions of the language, so a C99 compiler should be able to compile C89 code just fine.
You might get some warnings, but the code should compile and work just fine, provided the code is portable, of course.

GCC: how to stop false-positive implicit-function-declaration warnings for functions in ROM?

I want to get rid of all implicit-function-declaration warnings in my codebase. But there is a problem, because some functions are programmed into the microcontroller ROM at the factory, and during linking a linker script provides only the function addresses. These functions are called by code in the SDK.
During compilation gcc of course emits the implicit-function-declaration warning for them. How can I get rid of this warning?
To be clear, I understand why the warning is there and what it means. But in this particular case the developers of the SDK guarantee that the code will work with the implicit rules (i.e. each implicit function takes only ints and returns an int). So this warning is a false positive.
This is gnu99 C only, no C++.
Ideas:
Guess the argument types, write a prototype in a header and include that?
Tell gcc to treat such functions as false positive with some gcc attribute?
You can either declare a prototype in a header, or suppress the warning with the following:
#pragma GCC diagnostic push
#pragma GCC diagnostic ignored "-Wimplicit-function-declaration"
/* line where GCC complains about implicit function declaration */
#pragma GCC diagnostic pop
Write a small program that generates a header file romfunctions.h from the linker script, with a line like this
int rom_function();
for each symbol defined by the ROM. Run this program from your Makefiles. Change all of the files that use these functions to include romfunctions.h. This way, if the linker script changes, you don't have to update the header file by hand.
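A minimal sketch of what such a generated header might contain (the include guard and the symbol names here are hypothetical; the real names come from your linker script):
/* romfunctions.h -- generated from the linker script, do not edit by hand */
#ifndef ROMFUNCTIONS_H
#define ROMFUNCTIONS_H

/* Non-prototype declarations: per the SDK's guarantee, each ROM routine
   takes promoted-int arguments and returns int. */
int rom_uart_send();
int rom_flash_erase();

#endif /* ROMFUNCTIONS_H */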
Because most of my programming expertise was acquired by self-study, I have intentionally become somewhat anal about resolving non-fatal warnings, specifically to avoid picking up bad coding habits. But this has revealed to me that such bad habits are quite common, even among formally trained programmers. In particular, for someone like me who is also anal about NOT using MS Windows, my self-study of so-called platform-independent code such as OpenGL and Vulkan has revealed a WORLD of bad coding habits, particularly in example code written on the assumption that the student is using Visual Studio and a Windows C/C++ compiler.
Recently, I encountered NUMEROUS non-fatal warnings as I built an Ubuntu Qt Console implementation of an online example of how to use SPIR-V shaders with OpenGL. I finally threw in the towel and added the following lines to my qmake .PRO file to get rid of the non-fatal warnings (after first studying each one and convincing myself it could be safely ignored):
QMAKE_CFLAGS += -Wno-implicit-function-declaration \
                -Wno-address-of-packed-member
[Completely rewritten due to comments]
You are compiling the vendor SDK together with your own code. This is not typically what you want to do.
What you do instead is build their SDK files with gcc -c -Wno-implicit-function-declaration, and your own files with gcc -c, or possibly gcc -o output all-your-c-files all-their-o-files.
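A rough sketch of that split, assuming placeholder file names sdk_glue.c (the SDK code that calls the ROM routines) and main.c (your own code):
gcc -c -Wno-implicit-function-declaration sdk_glue.c
gcc -c main.c
gcc -o firmware main.o sdk_glue.o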
C does not require that declarations be prototypes, so you can get rid of the problem (which arguably should be a hard error, not a warning, since implicit declarations are no longer valid C) by using a non-prototype declaration, which requires knowing only the return type. For example:
int foo();
Since "implicit declarations" were historically treated as returning int, you can simply use int for all of them.
If you are writing a C program, use
#include <stdio.h>

Force a C compiler to produce integer narrowing warning

Let's consider the following example:
#include <stdio.h>

void func(unsigned char c) {
    printf("0x%x\n", c);
}

int main() {
    int val = 0x11223344;
    func(val);
}
To the best of my knowledge, there is no way I can force gcc or clang to show a warning on the statement func(val) about the int -> unsigned char narrowing that will happen there, not even by compiling with -Wall -Wextra -pedantic. The question targets mainly C code, but it is worth including the C++ world in the discussion as well (see the note below).
C++ Note
I'm well aware that in C++ there is a kind-of workaround using the uniform initialization syntax:
func({val});
But that does not solve my problem because:
for preexisting code, it requires changes
for new code, it would require using {} everywhere
Question 1
Is there any arcane option to achieve that when compiling C or C++ code? I'm fine also with a non-standard solution as long as it works with gcc or clang and it does not require changing the code. Note: I'm not looking for tricky C++ solutions using custom integer types with or without macros that wrap primitive types. I'm looking for something like a command-line option or a pragma. Again, the question is mostly for C code, but it's worth exploring any C++ solutions too.
Question 2 (fall-back)
If the reality turns out to be that (as I suspect) no such solution exists, I'd be super curious to understand why. I can't believe that such an option was simply never considered for implementation. There must be a list of reasonable arguments against it that I just can't think of. The thing is, the option could simply be non-standard, like -fwrapv, and people could use it only where it is really needed.
Is -Wconversion what you're looking for?
You can see the behavior here, with a lot of cases.
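For the snippet in the question, compiling with that flag does flag the call. The output below is only illustrative (file name is a placeholder, and the exact wording and positions differ between compiler versions):
gcc -Wconversion -c example.c
example.c: In function 'main':
example.c:9:10: warning: conversion from 'int' to 'unsigned char' may change value [-Wconversion]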

getchar_unlocked() implicit declaration in C99

Using getchar_unlocked and compiling with the -std=c99 flag gives a warning as follows:
warning: implicit declaration of function 'getchar_unlocked' [-Wimplicit-function-declaration]
It does not give any warning if compiled without the flag. Is there any way to work around it?
Starting from C99, you must have a visible declaration before calling a function. The earlier C standard would just blindly assume that any function unknown to the compiler has the form int func(params), which in turn would cause severe bugs much of the time.
Properly declare a prototype for getchar_unlocked and the warning will go away.
Note that there is no such function in the standard C library. It seems you might have to include some non-standard library for the compiler to find the function.
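A minimal sketch of what that looks like, assuming your C library actually ships the symbol (as POSIX systems do; see the next answer):
#include <stdio.h>

/* Hand-written prototype, since -std=c99 hides the header's own declaration. */
int getchar_unlocked(void);

int main(void) {
    int c = getchar_unlocked();
    return c == EOF ? 1 : 0;
}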
The _unlocked versions of the get... functions are POSIX extensions; they are not among the standard functions of C99. The full list of get... functions in C99 (7.19.7, plus 7.24.3 for the wide-character variants) is: getc, getchar, gets (deprecated), getwc, and getwchar.
Since getchar_unlocked is not on this list, a C99-compliant compiler must warn you that your program may not compile with other C99-compliant compilers.
Dialect selection options like -ansi and -std=c99 cause the compiler to define certain macros (in addition to altering the accepted dialect).
Library header files react to those macros.
Precisely how they react is quite system-dependent (the compiler doesn't provide a C library), but a common behavior you can broadly expect is that if you use one of these flags alone (without any other "feature selection macro"), it has the effect of hiding the declarations of functions, macros and other global symbols which are not in the specified ISO C dialect.
ISO C knows nothing about getchar_unlocked. The presence of such a declaration in <stdio.h> (normally an ISO C header) is a POSIX extension, which is basically nonconforming, since getchar_unlocked is an identifier that strictly conforming C programs can use, even if they include <stdio.h>. When you use -ansi or -std=c99, the <stdio.h> header listens up and whips itself into ISO-C-conforming shape, hiding such extensions.
On well-behaved POSIX systems, you can request that you want an ISO C dialect and that you want certain rudimentary 1990-ish POSIX features to be visible in header files, for instance like this:
gcc -std=c99 -D_POSIX_SOURCE ...
^^^^^ "feature selection macro"
There is a whole science to these feature selection macros, too broad for this question and answer; some forms of them have values, like -D_XOPEN_SOURCE=500. _POSIX_SOURCE doesn't need an argument; it is just defined or not, but _POSIX_C_SOURCE is numeric.
I just checked glibc and Cygwin: on both, _POSIX_SOURCE is enough to reveal the getchar_unlocked declaration. It is quite old, dating back to POSIX.1 1996.
Beware: on some systems, multiple feature selection macros don't play along reasonably; they give you a set intersection rather than union, so that -D_POSIX_SOURCE and -D_BSD_SOURCE together end up meaning "Declare to me only those handful of functions that are specific to classic BSD that have been standardized in POSIX too", which means that next to nothing is declared.
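Putting that together, here is a small sketch that should compile cleanly with -std=c99 on glibc or Cygwin, per the above (the file name is arbitrary):
/* unlocked_cat.c -- build with: gcc -std=c99 -D_POSIX_SOURCE -Wall unlocked_cat.c */
#include <stdio.h>

int main(void) {
    int c;
    while ((c = getchar_unlocked()) != EOF)
        putchar(c);
    return 0;
}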
getchar_unlocked is not a standard C function.
When you compile forcing the C99 standard, it is not supported natively.

restrict qualifier compilation error

I am using Code::Blocks 10.05 and MinGW. It seems the compiler does not recognize the restrict qualifier and returns "error: expected ';', ',' or ')' before 'src'". Do I need to pass any compiler option in order to compile it correctly?
int inet_pton4 (const char *restrict src, unsigned char *restrict dst)
P.S.: it seems MinGW does not provide inet_pton4, so I tried to integrate an open-source version into my code.
If your compiler does not support the restrict keyword, just take that keyword out (a).
It's used to indicate to the compiler that you (the developer) promise that the pointers follow certain properties involving aliasing, and this, in turn, allows the compiler to perform certain optimisations that would otherwise not necessarily be safe.
If you leave off that keyword in a compiler that supports it, it prevents those optimisations (slight downside).
If you leave it off for compilers that don't support that keyword, the downside is nil (since they don't support those optimisations anyway) and the upside is considerable, as in "it will compile for you" :-)
(a) You may want to ensure you're compiling in C99 mode first. While it may be true that you're using an older gcc that doesn't understand restrict, it's equally possible that you're not compiling in C99 mode, such as with -std=c99 (gcc docs seem to indicate that restrict has been supported even back to version 3.0).
If, for some reason you cannot activate C99 mode, I think gcc has an extension that allows you to use __restrict.
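For example, a declaration using the GNU spelling, which gcc accepts even outside C99 mode (a sketch only, mirroring the declaration from the question):
int inet_pton4(const char *__restrict src, unsigned char *__restrict dst);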
Since restrict is new in C99, and since, as @paxdiablo points out, omitting the restrict keyword doesn't really hurt anything, you can do this:
#if __STDC_VERSION__ < 199901L
#define restrict /* nothing */
#endif
Put this in a header that's #included by everything in your project (or at least by everything that uses restrict).
This should let your code compile with any C compiler, whether it supports C99 or not. It should even work for a compiler that doesn't define __STDC_VERSION__.
But, since you're using MinGW, which uses gcc, using gcc -std=c99 should also solve the problem (as @paxdiablo also points out).
You can safely do both. (And the #if solution is likely to be useful to others.)
