I'm trying to install the NVIDIA version of an externally supplied toolkit (for the purposes of this message it doesn't matter what the toolkit is; this is a problem about how to use nvcc), and I'm getting error messages like "/usr/include/c++/6/utility(329): error: this declaration may not have extern "C" linkage".
I'm not a C or C++ programmer, but I am happy enough poking around in things like Makefiles. I'm pretty sure that I've got all the paths set to point to the right places, and /usr/include/c++/6 contains all the files that are generating the error messages. But I have no idea what these error messages mean and what I should do to get round them.
I believe these errors come from C/C++ linkage (name mangling) differences and the fact that nvcc compiles .cu files as C++. I was able to compile HTK 3.5 by simply removing the extern "C" declarations from HCUDA.cu:
#ifdef __cplusplus
extern "C" {
#endif
/* ... */
#ifdef __cplusplus
}
#endif
I suspect that, because they're already declared extern "C" in HCUDA.h, they don't need to be declared such in HCUDA.cu, but I'm not sure.
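If you do want explicit C linkage, a minimal sketch of the usual layout is below (the file and function names are made up, not HTK's): keep the extern "C" guards only around the declarations in the header, and leave the .cu file free of any wrapper. The definitions then pick up C linkage from the header's declarations, and the C++ standard headers that nvcc pulls in are never placed inside an extern "C" block, which is what triggers errors like the one from /usr/include/c++/6/utility.
/* gpu_api.h -- hypothetical header shared by C and CUDA code */
#ifdef __cplusplus
extern "C" {
#endif
void gpu_init(int device);   /* gets C linkage in both languages */
#ifdef __cplusplus
}
#endif

/* gpu.cu -- compiled as C++ by nvcc; no extern "C" wrapper here */
#include <utility>       /* C++ headers stay outside any C-linkage block */
#include "gpu_api.h"

void gpu_init(int device)
{
    /* ... CUDA setup ... */
}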
I have a solution with a project consisting of a static library (one .h file, one .c file) written in C, and a utility program (.cxx) based on that library, written in C++.
The library compiles without error. The utility compiles too, but fails at link time with errors like:
1>abc.obj : error LNK2019: unresolved external symbol "struct DEFListen * __cdecl DEFListen(int,char *,int)" (?DEFListen@@YAPAU0@HPADH@Z) referenced in function _main
The header for the library includes extern "C" guards:
#ifdef __cplusplus
extern "C" {
#endif
By putting garbage inside the ifdef, I get compile warnings when compiling (not linking) the utility as expected, so I know __cplusplus is in fact defined when the utility is compiled, and not for instance merely misspelled.
Yet the error message shows the C++ function signature (and the mangled name) for the function in question. dumpbin /symbols on the utility's object file of course confirms that the object contains mangled symbols.
In summary: extern "C" { is definitely being parsed at compile time yet ignored. Why?
The problem is embarrassingly simple: the closing brace code (below) had been cut and pasted to another location while re-arranging declarations, leaving several functions outside the extern "C" block.
#ifdef __cplusplus
}
#endif
Sorry I didn't spot that before posting for help.
I leave this shameful post to be found by someone equally foolish (possibly me) in the future.
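For anyone hitting the same thing, the failure mode looks roughly like this (the second function name is a placeholder): anything declared after the prematurely pasted closing brace falls outside the extern "C" block and gets a C++ mangled name:
#ifdef __cplusplus
extern "C" {
#endif
struct DEFListen *DEFListen(int port, char *host, int flags);

#ifdef __cplusplus
}   /* <-- closing brace pasted too early while re-arranging */
#endif

/* still in the header, but now outside the block: mangled as C++ */
struct DEFConnect *DEFConnect(int port, char *host);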
In my case I am writing a simple plugin system in C using dlfcn.h (linux). The plugins are compiled separately from the main program and result in a bunch of .so files.
There are certain functions that must be defined in a plugin in order for it to be called properly by the main program. Ideally, each plugin would include a .h file (or something similar) that states which functions a valid plugin must have; if those functions are not defined in the plugin, I would like the plugin to fail to compile.
I don't think you can enforce that a function be defined at compile time. However, if you use the GCC toolchain, you can use the --undefined flag when linking to enforce that a symbol be defined.
ld --undefined foo
will treat foo as though it is an undefined symbol that must be defined for the linker to succeed.
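When you don't invoke ld directly, the flag can be passed through the compiler driver. A rough sketch follows (the symbol and file names are made up; for a shared object you would also want something like --no-undefined, since .so files tolerate undefined symbols by default):
# fail the link of an executable if nothing defines plugin_draw
gcc -Wl,--undefined=plugin_draw -o prog main.c

# for a plugin .so, additionally forbid undefined symbols in the output
gcc -shared -fPIC -Wl,--undefined=plugin_draw -Wl,--no-undefined -o plugin.so plugin.c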
You cannot do that.
It's common practice to define only two exported functions in a library opened by dlopen(): one to import functions into your plugin and one to export functions from your plugin.
A few lines of code are better than any explanation:
#include <stddef.h>   /* for NULL */

struct plugin_import {
    void (*draw)(float);
    void (*update)(float);
};

struct plugin_export {
    int (*get_version)(void);
    void (*set_version)(int);
};

/* host-side functions handed to the plugin (defined elsewhere) */
extern void draw(float);
extern void update(float);

/* the only two functions the plugin exports; in a real dlopen() setup
   these would be looked up with dlsym() after loading the .so */
extern void import(struct plugin_import *);
extern void export(struct plugin_export *);

int setup(void)
{
    struct plugin_export out = {0};
    struct plugin_import in;

    /* give the plugin our function pointers */
    in.draw = &draw, in.update = &update;
    import(&in);

    /* get our functions out of the plugin */
    export(&out);

    /* verify that all functions are defined */
    if (out.get_version == NULL || out.set_version == NULL)
        return 1;
    return 0;
}
This is very similar to the system Quake 2 used. You can look at the source here.
The only difference is that Quake 2 exports just a single function, which imports and exports the functions defined by the dynamic library in one call.
Well, after doing some research and asking a few people I know on IRC, I have found the following solution:
Since I am using gcc, I am able to use a linker script.
linker.script:
ASSERT(DEFINED(funcA), "must define funcA" ) ;
ASSERT(DEFINED(funcB), "must define funcB" ) ;
If either of those functions is not defined, a custom error message will be output when the program is linked.
(more info on linker script syntax can be found here: http://www.math.utah.edu/docs/info/ld_3.html)
When compiling simply add the linker script file after the source file:
gcc -o test main.c linker.script
Another possibility:
Something that I didn't think of (it seems a bit obvious now), which was brought to my attention, is that you can create a small program that loads your plugin and checks that it has valid function pointers to all of the functions you want your plugin to have. Then incorporate this into your build system, be it a makefile, a script, or whatever. This has the benefit that you are no longer limited to a particular compiler, and you can do more sophisticated checks for other things as well. The only downside is that you have a little more work to do to set it up.
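As an illustration of that approach, here is a minimal sketch of such a checker (the required symbol names draw and update are just assumptions; substitute whatever your plugin interface demands). Build it with something like gcc -o check_plugin check_plugin.c -ldl and run it against each .so from your makefile; a non-zero exit status fails the build step.
/* check_plugin.c -- minimal sketch of a build-time plugin checker */
#include <dlfcn.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    const char *required[] = { "draw", "update" };  /* assumed names */
    void *handle;
    size_t i;
    int status = 0;

    if (argc < 2) {
        fprintf(stderr, "usage: %s plugin.so\n", argv[0]);
        return 2;
    }
    handle = dlopen(argv[1], RTLD_NOW);
    if (!handle) {
        fprintf(stderr, "cannot load %s: %s\n", argv[1], dlerror());
        return 2;
    }
    for (i = 0; i < sizeof required / sizeof required[0]; i++) {
        if (!dlsym(handle, required[i])) {
            fprintf(stderr, "missing required function: %s\n", required[i]);
            status = 1;
        }
    }
    dlclose(handle);
    return status;
}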
Why does redefining a function that is already present in a dynamic library not throw any compilation or linking error?
In the program below:
#include "calc_mean.h"
#include <stdio.h>
int mean(int t, int v) {
return 0;
}
int main () {
int theMean = mean(3,6);
printf("\n %d\n",theMean);
}
Inside the shared library, the definition of the mean function is already present, as shown below.
#include <stdio.h>
#include "calc_mean.h"
int mean(int a, int b) {
return (a+b)/2;
}
The definition of the mean function is already present in the shared library libmean.so, yet during compilation I don't see any redefinition error and compilation is successful.
On execution, the output I see is 0 instead of 4, so the definition of mean inside the shared library is not executed; the one inside the main module is.
Why is this happening?
The linker only links in a function from a library if the function has not already been found during the compilation/linking process.
The reason for the difference in behaviour is that there are different types of symbols. A library function is a weak symbol: it is only included if it is not already defined. nm is a tool for listing the symbols in an object file or executable; in its man page you can find a list of the symbol types.
There is also a wikipedia page on weak symbols.
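For example, nm can show you both definitions side by side (libmean.so is the library name from the question; main.o is just an assumed name for the main module's object file, and the exact output will vary):
nm -D libmean.so | grep ' mean'    # 'T mean': defined in the shared library
nm main.o | grep ' mean'           # 'T mean' here too: the main module carries its own copy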
Having two definitions of one externally-visible function (even if the definitions are identical, for non-inline functions) causes undefined behaviour, with no diagnostic required. (Ref: C99 6.9#5 and Annex J.2)
In C, some illegal code requires a compiler diagnostic and some doesn't. Typically, the cases that do not require a diagnostic exist because:
it would be considered too prohibitive to require all compilers to detect and report the error
there were existing systems in use that did not diagnose it and the Standard committee did not want to render an existing implementation non-conforming.
In this case, my guess would be that this is a case of the first one; they wanted to leave open the option for compilers/linkers to implement weak symbols as an extension, so they did not specify that the compiler must give a warning here. Or possibly it is actually difficult to detect this in general; I've never tried to write a linker!
It should be considered a quality-of-implementation issue if no diagnostic is given. Perhaps it is possible to pass different flags to your linker so that it does reject this code; if not, then you could put in a bug report or a feature request.
Did you link the shared library correctly? The linker should give the error:
multiple definition of 'mean'
(I found this question, which is similar but not a duplicate:
How to check validity of header file in C programming language)
I have a function implementation, and a non-matching prototype (same name, different types) which is in a header file. The header file is included by a C file that uses the function, but is not included in the file that defines the function.
Here is a minimal test case:
header.h:
void foo(int bar);
File1.c:
#include "header.h"
int main (int argc, char * argv[])
{
int x = 1;
foo(x);
return 0;
}
File2.c:
#include <stdio.h>
typedef struct {
int x;
int y;
} t_struct;
void foo (t_struct *p_bar)
{
printf("%x %x\n", p_bar->x, p_bar->y);
}
I can compile this with VS 2010 with no errors or warnings, but unsurprisingly it segfaults when I run it.
The compiler is fine with it (this I understand)
The linker did not catch it (this I was slightly surprised by)
The static analysis tool (Coverity) did not catch it (this I was very surprised by).
How can I catch these kinds of errors?
[Edit: I realise if I #include "header.h" in file2.c as well, the compiler will complain. But I have an enormous code base and it is not always possible or appropriate to guarantee that all headers where a function is prototyped are included in the implementation files.]
Have the same header file included in both file1.c and file2.c. This will pretty much prevent a conflicting prototype.
Otherwise, such a mistake cannot be detected by the compiler because the source code of the function is not visible to the compiler when it compiles file1.c. Rather, it can only trust the signature that has been given.
At least theoretically, the linker could be able to detect such a mismatch if additional metadata is stored in the object files, but I am not aware if this is practically possible.
Use -Werror-implicit-function-declaration, -Wmissing-prototypes, or the equivalent on one of your supported compilers. Then it will either error out or complain when a declaration does not precede the definition of a global function.
Compiling the programs in some form of strict C99 mode should also generate these messages. GCC, ICC, and Clang all support this (I'm not sure about Microsoft's C compiler and its current status, as VS 2005 or 2008 was the latest I've used for C).
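For instance, with gcc (a sketch; the exact wording of the diagnostic varies by version), compiling File2.c with missing-prototype warnings promoted to errors flags the definition of foo because no prototype for it was seen first:
gcc -c -Wmissing-prototypes -Werror File2.c
# error: no previous prototype for 'foo' [-Wmissing-prototypes]  (approximate wording)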
You may use the Frama-C static analysis platform available at http://frama-c.com.
On your examples you would get:
$ frama-c 1.c 2.c
[kernel] preprocessing with "gcc -C -E -I. 1.c"
[kernel] preprocessing with "gcc -C -E -I. 2.c"
[kernel] user error: Incompatible declaration for foo:
different type constructors: int vs. t_struct *
First declaration was at header.h:1
Current declaration is at 2.c:8
[kernel] Frama-C aborted: invalid user input.
Hope this helps!
It looks like this is not possible with a C compiler because of the way function names are mapped to symbol names (directly, without encoding the actual signature).
But it is possible with C++, because C++ uses name mangling that depends on the function signature. So in C++, void foo(int) and void foo(t_struct*) will have different names at the link stage, and the linker will raise an error about the mismatch.
Of course, switching a huge C codebase to C++ will not be easy. But you can use a relatively simple workaround: add a single .cpp file to your project and include all the C files into it (you could even generate it with a script).
Taking your example and VS2010 I added TestCpp.cpp to project:
#include "stdafx.h"
namespace xxx
{
#include "File1.c"
#include "File2.c"
}
Result is linker error LNK2019:
TestCpp.obj : error LNK2019: unresolved external symbol "void __cdecl xxx::foo(int)" (?foo@xxx@@YAXH@Z) referenced in function "int __cdecl xxx::main(int,char * * const)" (?main@xxx@@YAHHQAPAD@Z)
W:\TestProjects\GenericTest\Debug\GenericTest.exe : fatal error LNK1120: 1 unresolved externals
Of course, this will not be so easy for a huge codebase; there can be other problems that lead to compilation errors and cannot be fixed without changing the codebase. You can partially mitigate that by protecting the .cpp file's contents with a conditional #ifdef and using it only for periodic checks rather than for regular builds.
Every (non-static) function defined in every foo.c file should have a prototype in the corresponding foo.h file, and foo.c should have #include "foo.h". (main is the only exception.) foo.h should not contain prototypes for any functions not defined in foo.c.
Every function should be prototyped exactly once.
You can have .h files with no corresponding .c files if they don't contain any prototypes. The only .c file without a corresponding .h file should be the one containing main.
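As a minimal sketch of that convention (foo_count is a placeholder name):
/* foo.h */
#ifndef FOO_H
#define FOO_H
int foo_count(const char *s);   /* the one and only prototype */
#endif

/* foo.c */
#include "foo.h"   /* the definition is checked against its own prototype */

int foo_count(const char *s)
{
    int n = 0;
    while (*s++)
        n++;
    return n;
}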
You already know this, and your problem is that you have a huge code base where this rule has not been followed.
So how do you get from here to there? Here's how I'd probably do it.
Step 1 (requires a single pass over your code base):
For each file foo.c, create a file foo.h if it doesn't already exist. Add #include "foo.h" near the top of foo.c. If you have a convention for where .h and .c files should live (either in the same directory or in parallel include and src directories), follow it; if not, try to introduce such a convention.
For each function in foo.c, copy its prototype to foo.h if it's not already there. Use copy-and-paste to ensure that everything stays consistent. (Parameter names are optional in prototypes and mandatory in definitions; I suggest keeping the names in both places.)
Do a full build and fix any problems that show up.
This won't catch all your problems. You could still have multiple prototypes for some functions. But you'll have caught any cases where two headers have inconsistent prototypes for the same function and both headers are included in the same translation unit.
Once everything builds cleanly, you should have a system that's at least as correct as what you started with.
Step 2:
For each file foo.h, delete any prototypes for functions that aren't defined in foo.c.
Do a full build and fix any problems that show up. If bar.c calls a function that's defined in foo.c, then bar.c needs a #include "foo.h".
For both of these steps, the "fix any problems that show up" phase is likely to be long and tedious.
If you can't afford to do all this at once, you can probably do a lot of it incrementally. Start with one or a few .c files, clean up their .h files, and remove any extra prototypes declared elsewhere.
Any time you find a case where a call uses an incorrect prototype, try to figure out the circumstances in which that call is executed, and how it causes your application to misbehave. Create a bug report and add a test to your regression test suite (you have one, right?). You can demonstrate to management that the test now passes because of all the work you've done; you really weren't just messing around.
Automated tools that can parse C are likely to be useful. Ira Baxter has some suggestions. ctags may also be useful. Depending on how your code is formatted, you can probably throw together some tools that don't require a full C parser. For example, you might use grep, sed, or perl to extract a list of function definitions from a foo.c file, then manually edit the list to remove false positives.
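For example (a sketch; the exact options depend on your ctags flavor), Exuberant/Universal Ctags can dump just the function definitions of a file, which you can then compare against the prototypes in its header:
ctags -x --c-kinds=f foo.c    # one line per function defined in foo.c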
Its obvious ("I have a huge code base") you cannot do this by hand.
What you need is an automated tool that can read your source files as the compiler sees them, collect all function prototypes and definitions, and verify that all definitions/prototypes match. I doubt you'll find such a tool lying around.
Of course, this matching must check the signatures, and that requires something like the compiler's front end to compare them.
Consider
typedef int T;
void foo(T x);
in one compilation unit, and
typedef float T;
void foo(T x);
in another. You can't just compare the signature "lines" for equality; you need something that can resolve the types when checking.
GCCXML may be able to help, if you are using a GCC dialect of C; it extracts top-level declarations from source files as XML chunks. I don't know if it will resolve typedefs, though. You obviously have to build (considerable) support to collect the definitions in a central place (a database) and compare them. Comparing XML documents for equivalence is at least reasonably straightforward, and pretty easy if they are formatted in a regular way. This is likely your easiest bet.
If that doesn't work, you need something that has a full C front end that you can customize. GCC is famously available, and famously hard to customize. Clang is available, and might be pressed into service for this, but AFAIK only works with GCC dialects.
Our DMS Software Reengineering Toolkit has C front ends (with full preprocessing capability) for many dialects of C (GCC, MS, GreenHills, ...) and builds symbol tables with complete type information. Using DMS you might be able (depending on the real scale of your application) to simply process all the compilation units, and build just the symbol tables for each compilation unit. Checking that symbol table entries "match" (are compatible according to compiler rules including using equivalent typedefs) is built-into the C front ends; all one needs to do is orchestrate the reading, and calling the match logic for all symbol table entries at global scope across the various compilation units.
Whichever you use (GCC, Clang, or DMS), it is a fair amount of work to cobble together a custom tool. So you have to decide how critical having fewer surprises is, compared with the effort of building such a tool.