I'm using PC-Lint to lint a C project. I want to ignore errors and warnings in third-party libraries, but I haven't been able to achieve this. Reading the manual, I see that all #include files specified with angle brackets are considered library headers:
[...] and you want this header to be regarded as a library header use angle brackets
as in: #include <\include\graph.h>
Or, for example, using the -libh option to indicate that a header file is a library.
Using the option -vf, I've verified that my library files are being included as libraries. So everything is OK.
The problem is that I'm still receiving a lot of errors from these files. I thought that since these files are considered libraries, errors in them would be ignored.
How can I ignore errors in library files? I've tried -wlib(0), but this option ignores errors in my own header files too. In addition, it generates an uncomfortable warning:
Warning 686: Option '-wlib(0)' is suspicious because
of 'the likelihood of causing meaningless output'; receiving a syntax error
in a library file most likely means something is wrong with your Lint
configuration
Any suggestions? Thanks in advance.
I had to read the PC-Lint manual and check the output log several times. The "problem" is that, by default, the option
+libclass(angle, foreign)
is enabled, so all such .h files (those included with angle brackets or found in foreign include directories) are considered libraries. It is necessary to override this by using:
+libclass(angle)
in order to treat these files as ordinary headers and not libraries.
Thanks again.
Sorry for posting late but I found this thread when looking for ways to remove the -wlib(0) warning in the output.
So, for others looking for that answer: a simple -e686 placed before the -wlib(0) removes that warning from the output.
I understand this does not answer the original question, but sometimes this is what you want to do.
I like to keep my files clean, so I prefer to take out includes I don't need. Lately I've been just commenting the includes out and seeing if it compiles without warnings (-Wall -Wextra -pedantic, minus a couple very specific ones). I figure if it compiles without warnings I didn't need it.
Is this actually a safe way to check if an include is needed or can it introduce UB or other problems? Are there any specific warnings I need to be sure are enabled to catch potential problems?
N.B. I'm actually using Objective-C and Clang, so anything specific to those is appreciated, but given the flexibility of Objective-C I think if there's any trouble it will be a general C thing. Certainly any problems in C will affect Objective-C.
In principle, yes.
The exception would be if two headers interact in some hidden way. Say, if you:
include two different headers which define the same symbol differently,
both definitions are syntactically valid and well-typed,
but one definition is good, the other breaks your program at run-time.
Hopefully, your header files are not structured like that. It's somewhat unlikely, though not inconceivable.
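For concreteness, here's a hedged sketch of the kind of hidden interaction meant above (the file names and the MIN macro are made up for illustration):

/* fast_min.h -- the "good" definition */
#ifndef MIN
#define MIN(a, b) ((a) < (b) ? (a) : (b))
#endif

/* legacy_min.h -- syntactically valid, but missing parentheses */
#ifndef MIN
#define MIN(a, b) (a < b ? a : b)
#endif

/* main.c */
#include "fast_min.h"
#include "legacy_min.h"   /* silently skipped: MIN is already defined */

int low_bit_or_y(int x, int y)
{
    /* Fine today. But if fast_min.h is removed because "it still compiles",
       the legacy definition takes over and the condition silently becomes
       x & (1 < y) instead of (x & 1) < y: wrong results at run time,
       with no compiler diagnostic at all. */
    return MIN(x & 1, y);
}

Because each header guards its definition with #ifndef, there is no redefinition warning; whichever header happens to be included first wins, so removing an include can silently change behaviour.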
I'd be more comfortable doing this if I had good (unit) tests.
Usually just commenting out the inclusion of the header is safe, meaning: if the header is needed then there will be compiler errors when you remove it, and (usually) if the header is not needed, the code will still compile fine.
This should not be done without inspecting the header to see what it adds, though, as there is the (not exactly typical) possibility that a header only provides optional #defines (or #undefs) which will alter, but not break, the way a program is compiled.
The only way to be sure is to build your code without the header (if it's able to build in the first place) and run a proper regimen of testing to ensure its behavior has not changed.
No. Apart from the reasons already mentioned in other answers, it's possible that the header is needed and another header includes it indirectly. If you remove the #include, you won't see an error but there may be errors on other platforms.
In general, no. It is easy to introduce silent changes.
Suppose header.h defines some macros like
#define WITH_FEATURE_FOO
The C file including header.h tests the macro
#ifdef WITH_FEATURE_FOO
do_this();
#else
do_that();
#endif
Your files compile cleanly and with all warnings enabled with or without the inclusion of header.h, but the result behaves differently. The only way to get a definitive answer is to analyze which identifiers a header defines/declares and see if at least one of them appears in the preprocessed C file.
One tool that does this is FlexeLint from Gimpel. I don't get paid for saying this, even though they should :-) If you want to avoid shelling out big bucks, an approach I have been taking is to compile a C file to an object file with and without the header; if both compile, check whether the object files are identical. If they are the same, you don't need the header
(but watch out for include directives wrapped in #ifdefs that are enabled by a -DWITH_FEATURE_FOO option).
I am writing a program in C, under XCode 3.2.5, under Snow Leopard (10.6.8).
I have compiled, made etc. GMP a number of times, and I think (or hope!) I have that aspect under control. I am installing it in a sub-directory under /Developer/usr, and I have added search paths (and the -lgmp entry) as per the advice I have found on the www. (I have put in usr/local entries before my install path, just in case that helps somehow.)
I got an error as follows: “ld: warning: in /Developer/usr/gmp-5.1.2_core2-apple-darwin10.8.0/lib/libgmp.dylib, file was built for unsupported file format which is not the architecture being linked (i386)”.
When I compile etc. for the platform that the compiler chooses (being core2-apple-darwin10.8.0) — either explicitly or by letting it choose — I get the above message.
When I compile etc., specifying the “build” and “host” as i386 (as per the above error message), I get no build errors. However…
The functions appear to be doing nothing. For instance, the following line
blCheckSetStr = mpz_init_set_str (zNumber, "85", 10);
appears to set zNumber to 1. However, when I refer to it later, I get the error message “EXC_BAD_ACCESS”.
For what I assume to be the same reason, I have found several commands that do not work, and have substituted other commands.
I would really appreciate any help, please. Thank you in advance.
P.S. I should have mentioned: in my program (in C), I have been unable to use functions; I worked around this by just including everything in main. I am guessing that this is probably related to why the GMP functions are not working. (Sorry; I forgot about it because it was no longer an issue for me.)
Remembering the names of system header files is a pain...
Is there a way to include all existing header files at once?
Why doesn't anyone do that?
Including unneeded header files is a very bad practice. The issue of slowing down compilation might or might not matter; the bigger issue is that it hides dependencies. The set of header files you include in a source file is, in effect, the documentation of what functionality the module depends upon, and unlike external documentation or comments, it is automatically checked for completeness by the compiler (failing to include needed header files results in an error). Ensuring the absence of unwanted dependencies not only improves portability; it also helps you track down unneeded and potentially dangerous interactions, for instance cases where a module which should be purely computational or purely data-structure management is accessing the filesystem.
These principles apply whether the headers are standard system headers or headers for modules within your own program or third-party libraries.
Your source code files are preprocessed before the compiler looks at them, and #include is one of the directives the preprocessor handles. During preprocessing, each #include directive is replaced with the entire contents of the file being included. The result of including all of the system headers would be very large source files that the compiler then needs to work through, which will cost a lot of time during compilation.
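As a small illustration of that textual replacement (file names are made up):

/* point.h */
#define ORIGIN_X 0
struct point { int x, y; };

/* main.c, as written */
#include "point.h"
int main(void) { struct point p = { ORIGIN_X, 2 }; return p.y; }

/* main.c, as the compiler actually sees it after preprocessing:
   the directive is gone, the header's text is pasted in, and
   ORIGIN_X has been expanded. */
struct point { int x, y; };
int main(void) { struct point p = { 0, 2 }; return p.y; }

Multiply that pasted text by every system header on the machine and each of your source files balloons accordingly.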
No one includes all the header files. There are too many, and a few of them are mutually exclusive with other files (like ncurses.h and curses.h).
It really is not that bad, even when writing a program from scratch. A few are quite easy to remember: stdio.h for any FILE stuff, ctype.h for any character classification, stdlib.h for any use of malloc(), etc.
If you don't remember one:
leave the #include out
compile
examine the first few error messages for indications of a missing header file, such as a type not being declared or a function being called with assumed parameter types (a sketch of what this looks like follows the list)
figure out which function call is the cause
look at the man page (or whatever documentation your compiler has) for that function
notice the #include shown by the documentation and add it
repeat until all errors fixed
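A hedged sketch of what that looks like in practice, assuming a forgotten <stdlib.h> (the exact diagnostic wording varies by compiler):

/* oops.c -- stdio.h is remembered, stdlib.h is not */
#include <stdio.h>

int main(void)
{
    char *p = malloc(16);       /* gcc/clang: implicit declaration of function 'malloc' */
    if (p == NULL)
        return 1;
    printf("%p\n", (void *)p);
    free(p);                    /* same complaint for free() */
    return 0;
}

The man page for malloc() shows the missing #include <stdlib.h>; add it and recompile.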
It is quite a bit easier when adding to an existing code base. You could go hundreds or thousands of working hours and never have to add a #include.
No, it is a terrible idea: it will massively increase your compile times and possibly make your executable a lot larger by pulling in massive amounts of unused code.
I know what you're talking about, but I need to double-check the function prototypes for the functions I'm using (for ones I don't use daily, anyway) -- I'll just copy and paste the #includes straight out of the manpage for the associated functions. I'm already looking at the manpage (it's a simple K in vim(1)), so it doesn't feel like an extra burden.
You can create a "master" header into which you put all your includes, and then include it everywhere else! Beware of conflicting definitions and circular references... So... Master1.h, Master2.h, ...
Not advocating it. Just saying.
I've been working for some time with an open-source library ("fast artificial neural network"). I'm using its source in my static library. When I compile it, however, I get hundreds of linker warnings, which are probably caused by the fact that the library includes its *.c files in other *.c files (I'm only including some headers I need, and I did not touch the code of the lib itself).
My question: is there a good reason why the developers of the library used this approach, which is strongly discouraged? (Or at least I've been told all my life that this is bad, and from my own experience I believe it IS bad.) Or is it just bad design and there is no gain in this approach?
I'm aware of this related question but it does not answer my question. I'm looking for reasons that might justify this.
A bonus question: Is there a way how to fix this without touching the library code too much? I have a lot of work of my own and don't want to create more ;)
As far as I can see (grep '#include .*\.c'), they only do this in doublefann.c, fixedfann.c, and floatfann.c, and each time they include the reason:
/* Easy way to allow for build of multiple binaries */
This exact use of the preprocessor for simple copy-pasting is indeed the only valid use of including implementation (*.c) files, and it is relatively rare. (If you want to include some code for another reason, just give it a different name, like *.h or *.inc.) An alternative is to specify the configuration in macros given to the compiler (e.g. -DFANN_DOUBLE, -DFANN_FIXED, or -DFANN_FLOAT), but they didn't use this method. (Each approach has drawbacks, so I'm not saying they're necessarily wrong; I'd have to look at that project in depth to determine that.)
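For illustration, a hedged sketch of the pattern they are using (these are hypothetical file and macro names, not FANN's actual ones):

/* impl.c -- the shared implementation, parameterised by NUM_T */
NUM_T half(NUM_T x)
{
    return x / (NUM_T)2;
}

/* double_build.c -- one such wrapper per binary */
#define NUM_T double
#include "impl.c"

/* float_build.c */
#define NUM_T float
#include "impl.c"

Each wrapper is compiled into its own binary, so the two expansions of half() never end up in the same link; including the .c file here is just an easy way to compile the same source text twice with different settings.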
They provide makefiles and MSVS projects which should already avoid linking doublefann.o (from doublefann.c) with fann.o (from fann.c), fixedfann.o (from fixedfann.c), and so on; either their project files are broken or something similar has gone wrong on your side.
Did you try to create a project from scratch (or use your existing project) and add all the files to it? If you did, what is happening is each implementation file is being compiled independently and the resulting object files contain conflicting definitions. This is the standard way to deal with implementation files and many tools assume it. The only possible solution is to fix the project settings to not link these together. (Okay, you could drastically change their source too, but that's not really a solution.)
While you're at it, if you continue without using their project settings, you can likely skip compiling fann.c et al.; possibly just removing those from the project is enough – then they won't be compiled and linked. You'll want to choose exactly one of double-/fixed-/floatfann to use, otherwise you'll get the same link errors. (I haven't looked at their instructions, but I would not be surprised to see this summary explained a bit more in-depth there.)
Including C/C++ code leads to all the code being stuck together in one translation unit. With a good compiler, this can lead to a massive speed boost (as stuff can be inlined and function calls optimized away).
If actual code is going to be included like this, though, it should have static in most of its declarations, or it will cause the warnings you're seeing.
If you ever define a single global variable or function in that .c file, it cannot be included in two places which both compile into the same binary, or the two definitions will collide. If it is included in even one place, it cannot also be compiled on its own while still being linked into the same binary as its user.
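A minimal sketch of that collision, and of the static fix mentioned above (file names are made up):

/* helpers.c -- meant to be #included, never compiled on its own */
int counter = 0;                      /* external definition */
int bump(void) { return ++counter; }  /* external definition */

/* a.c */
#include "helpers.c"
/* b.c */
#include "helpers.c"

/* Linking a.o and b.o into one binary fails:
       multiple definition of 'counter'
       multiple definition of 'bump'
   Declaring them static keeps each copy private to its translation unit,
   and the link succeeds:
       static int counter = 0;
       static int bump(void) { return ++counter; } */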
If the file is only included in one place, why not just make it a discrete compilation unit (and use its globals via extern declarations)? Why bother having it included at all?
If your C files define no global variables or functions, they are effectively header files and should be named as such.
Therefore, by exhaustive search, I can say that the only time you would ever potentially want to include C files is if the same C code is used in building multiple different binaries. And even there, you're increasing your compile time for no real gain.
This is assuming that functions which should be inlined are marked inline and that you have a decent compiler and linker.
I don't know of a quick way to fix this.
I don't know that library, but as you describe it, it is either bad practice or your understanding of how to use it is not good enough.
A C project that wants to be included by others should always provide well-structured .h files for others, and then the compiled library for linking. If it wants to include function definitions in header files, it should mark them either as static (old-fashioned) or as inline (possible since C99).
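A hedged sketch of that inline-in-header style (the header name and function are made up):

/* vecmath.h */
#ifndef VECMATH_H
#define VECMATH_H

/* 'static inline' lets the same definition appear in every translation unit
   that includes this header, with no multiple-definition link errors. */
static inline double vm_dot2(double ax, double ay, double bx, double by)
{
    return ax * bx + ay * by;
}

#endif /* VECMATH_H */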
I haven't looked at the code, but it's possible that the .c or .cpp files being included actually contain code that works in a header. For example, a template or an inline function. If that is the case, then the warnings would be spurious.
I'm doing this at the moment at home because I'm a relative newcomer to C++ on Linux and don't want to get bogged down in difficulties with the linker. But I wouldn't recommend it for proper work.
(I also once had to include a header.dat into a C++ program, because Rational Rose didn't allow headers to be part of the issued software and we needed that particular source file on the running system (for arcane reasons).)
I have two large framework libraries, whose header files are included in my project. Either one works flawlessly, but including both causes erratic behaviour (but no error message related to the macro).
I assume that they both #define a macro of the same name. What is the most efficient way to identify the problematic macro?
I assume that they both #define a macro of the same name.
That should generate at least a warning by the compiler (if they are in the same translation unit).
How do I identify redefined macros in C/C++?
As far as I know there is no straight-forward way.
Either one works flawlessly, but including both causes erratic behaviour
Can you please give us some details on the erratic behaviour? What actually happens? What makes you think it's the macro definitions?
If the header files are badly written and do #undef SYMBOL ... #define SYMBOL something-else, then you can't easily trace this. Simply #defining a macro twice should at least issue a warning. You'd have to look more closely at the 'erratic behavior'.
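A quick sketch of the difference (header names are hypothetical):

/* lib_a.h */
#define MAX_LEN 64

/* lib_b.h -- plain redefinition: the compiler can warn ("MAX_LEN" redefined) */
#define MAX_LEN 128

/* lib_c.h -- badly written: the #undef makes the redefinition completely silent */
#undef MAX_LEN
#define MAX_LEN 512

Including lib_a.h together with lib_b.h at least produces a diagnostic you can grep for; lib_a.h together with lib_c.h does not.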
Try looking at the preprocessed output to determine what's different about it when you #include the header files and when you don't. With gcc and with MSVC, you'd compile with -E to print the preprocessor output to stdout. (It likely will be many thousands of lines, so you'll want to pipe it to a file.)
You should be able to run ctags over your source.
ctags can generate a tags file that, amongst other things, contains the names and locations of the macros in your C files.
You can control the types of symbols that ctags will store in your tags file through the use of the --c-kinds option.
e.g.
ctags --c-kinds=+d -f tags --recurse ./your_source_directory/
You can then search for duplicates in the resultant tags file.
grep for #define ?
Are you sure the problem isn't something other than a macro? For example: pragmas for structure packing, global memory allocators, class-name collisions in the global namespace, messing with the locale, ...
1. Compile with all warnings on – they should tell you when a macro 'is already defined' (maybe you can modify the code in order to fix this).
2. If (1) doesn't help, try to create function wrappers for each library. This way you avoid including the conflicting headers by including the wrapped headers and using the wrapped functions. This is laborious, but it's sometimes the only way to make two libraries coexist in an application.
Basically, solution (2) makes a separation between the libraries. An example of such a conflict is ACE with wxWidgets (version 2.8) when forced to use precompiled libraries that are compiled with different options (one library Unicode, the other ASCII).