C: Linking functions works without shared headers or an extern declaration

I am currently "playing around" in a quite big and old codebase and, quite unfortunately, it has no fixed style attached to it. It was just made to work, which also means that quite a lot of it can be described as spaghetti code.
I came across something that I do not fully understand. The compiler is from ARM/KEIL and it is for an embedded system.
first file:
fileA.c
// prototype
int GetSomething( int a );
// implementation
int GetSomething( int a ) {
    DoSomething();
}
second file:
fileB.c
// prototype
int GetSomething( int a );
void main ( void ) {
    GetSomething(10);
}
There are no headers with a declaration for the function GetSomething, but the function is still correctly linked. Originally there was an extern keyword on the declaration of GetSomething in the second file, but with or without it the result is the same binary. The code has been tested and works.
I've seen this Stack Overflow question, but it doesn't seem to cover my case, as it appears to have nothing to do with the extern keyword.
I hope that somebody can explain this to me or tell me what is going on. Thanks.

Using header files and #include directives is just a more organized and neater way to reuse declarations in different parts of a program.
When you do something like #include "header.h", the contents of header.h are copied into the file by the preprocessor.
So when you write
int GetSomething( int a );
directly in fileB.c, you are essentially doing by hand what #include would normally do for you.
Another important detail is that function declarations have the extern storage class specifier by default, which is why adding or removing extern makes no difference to the binary.
One thing you should keep in mind is that manually repeating function prototypes across your files is error prone and hard to maintain. So it is best to use header files and #include directives.
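For comparison, here is a minimal sketch of how the same two files would usually be organized with a shared header (the header name is made up for the example, and the function body is simplified):
// getsomething.h (hypothetical name)
#ifndef GETSOMETHING_H
#define GETSOMETHING_H
int GetSomething( int a );
#endif

// fileA.c
#include "getsomething.h"
int GetSomething( int a ) {
    return a;   // body simplified for the sketch
}

// fileB.c
#include "getsomething.h"
void main ( void ) {   // void main kept as in the original KEIL code
    GetSomething(10);
}
Either way the linker resolves the call the same; the header just guarantees that every file sees one consistent declaration.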

Related

Is it possible to have only a header file in C without a source file

I would like to write a C library with fast access by including just header files, without using a compiled library. For that, I have put my code directly in my header file.
The header file contains:
#include <stdio.h>
#include <stdlib.h>
#include <math.h>
#ifndef INC_TEST_H_
#define INC_TEST_H_
void test(){
    printf("hello\n");
}
#endif
My program doesn't compile because I have multiple references to the function test(). If I had a proper source file to go with my header, it would work without error.
Is it possible to use only header file by including code inside in a C app?
Including code in a header is generally a really bad idea.
If you have file1.c and file2.c, and each of them includes your coded.h, then at link time there will be two test functions with external linkage (one from file1.c and the other from file2.c).
You can use the keyword static so that the function is restricted to being visible only in the .c file that includes coded.h, but then again, it's a bad idea.
Last but not least: how do you intend to make a library without a .so/.a file? This is not a library; this is copy/pasting code directly into your project.
And when a bug is found in your "library", you will be left with no option apart from correcting your code, redistributing it to every project, and recompiling every project, missing the very point of a dynamic library: the ability to "just" fix the library without touching every program that uses it.
If I understand what you're asking correctly, you want to create a "library" which is strictly source code that gets #included as necessary, rather than compiled separately and linked.
As you have discovered, this is not easy when you're dealing with functions - the compiler complains of multiple definitions (you will have the same problem with object definitions).
You have a couple of options at this point.
You could declare the function static:
static void test( void )
{
...
}
The static keyword limits the function's visibility to the current translation unit, so you don't run into multiple definition errors at link time. It means that each translation unit is creating its own separate "instance" of the function, leading to a bit of code bloat and slightly longer build times. If you can live with that, this is the easiest solution.
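As a concrete sketch, reusing the guard name from the question, the header-only version with static would look like this; every .c file that includes it gets its own private copy of test():
#ifndef INC_TEST_H_
#define INC_TEST_H_
#include <stdio.h>

/* static: each translation unit gets its own copy,
   so the linker never sees a duplicate definition */
static void test( void )
{
    printf("hello\n");
}
#endif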
You could use a macro in place of a function:
#define TEST() (printf( "hello\n" ))
except that macros are not functions and do not behave like functions. While macro-based "libraries" do exist, they are not trivial to implement correctly and require quite a bit of thought. Remember that macro arguments are not evaluated, they're just expanded in place, which can lead to problems if you pass expressions with side effects. The classic example is:
#define SQUARE(x) ((x)*(x))
...
y = SQUARE(z++);
SQUARE(z++) expands to ((z++)*(z++)), which leads to undefined behavior.
Separate compilation is a Good Thing, and you should not try to avoid it. Doing everything in one source file is not scalable, and leads to maintenance headaches.
My program doesn't compile because I have multiple references to the test() function
That is because the .h file containing the function is included and compiled in multiple C source files. As a result, the linker encounters the same externally visible function more than once.
You could define the function as static, which means it will only have scope within the current compilation unit, so:
static void test()
{
    printf("hello\n");
}

Interface/Implementation in ANSI C

I'm working on a large project in C, and I want to organize it using interface (.h) and implementation (.c) files, similar to many object-oriented languages such as Objective-C or Java. I am familiar with creating static libraries in C, but I think that doing so for my project is unnecessarily complex. How can I implement an interface/implementation paradigm in ANSI C? I'm primarily using GCC for compilation, but I'm aiming for strict adherence to ANSI C and cross-compiler compatibility. Thanks!
It sounds like you are already doing the right thing: good C code also organizes interfaces in .h-files and implementations in .c-files.
Example a.h file:
void f(int a);
Example a.c file:
#include "a.h"
static void helper(void) {...}
void f(int a) {... use helper()...}
Example main.c file:
#include "a.h"
int main(void) { f(123); return 0; }
You get modularity because the helper functions are not declared in the header, so other modules don't know about them (you can declare them at the top of the .c file if you want). This modularity reduces both the number of recompiles needed and how much has to be recompiled (the linking still has to be done every time, though). Note that if you do not declare helper functions in the header you are already fairly safe, but putting static in front of them also hides them from other modules at link time, so there is no conflict if multiple modules use the same helper-function names.
If you are working with only primitive types then that is all you need to know and you can stop reading here. However if your module needs to work with a struct then it gets just a little more complicated.
Problematic example header b.h:
typedef struct Obj {
    int data;
} *Obj;
Obj make(void);
void work(Obj o);
Your module wants to pass objects in and out. The problem here is that the internals are leaked to every module that depends on this header. If the representation is changed to float data, then all of the using modules have to be recompiled. One way to fix this is to only use void*; that is how many programs do it. However, that is cumbersome because every function that receives the void* has to cast it back to Obj. Another way is to do this:
Header c.h:
typedef struct Obj *Obj;
Obj make(void);
void work(Obj o);
Implementation c.c:
#include "c.h"
struct Obj {
    int data;
};
The reason this works is that Obj is a pointer (as opposed to a struct passed by value/copy). Other modules that depend on this module only need to know that a pointer is passed in and out, not what it points to.
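To make the pattern concrete, here is a minimal sketch of how c.c might go on to implement make() and work() behind the opaque type (the function bodies are illustrative assumptions):
/* c.c, continued */
#include <stdlib.h>

Obj make(void)
{
    Obj o = malloc(sizeof *o);   /* callers never see the struct layout */
    if (o != NULL)
        o->data = 0;
    return o;
}

void work(Obj o)
{
    o->data += 1;   /* only this file knows there is a data field */
}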
You should read something about OOP in non-OO languages, such as http://www.cs.rit.edu/~ats/books/ooc.pdf.
But doing it that way, you will never have strong OOP typing.
Please do yourself a favor and read C Interfaces and Implementations: Techniques for Creating Reusable Software
Here is a repository of mine that holds some libs written in C using the pattern of interfaces & implementation described in the book.

Multiple Inheritance in C

I am stuck with a classical Multiple Inheritance problem in C.
I have created source files Stack.c and Queue.c. Both of them #include a file Node.c (which contains functions to allocate and deallocate memory). Now I am trying to implement another program in a single file, for which I need to include both Stack.c and Queue.c.
I tried to #include both the files, but the compiler is throwing a conflicting type error.
What is the most correct way to do so?
Thanks in advance!!
Calling this "multiple inheritance" may be confusing because multiple inheritance is an object-oriented programming issue that doesn't arise in C.
It appears to me that your difficulty may be that you are trying to #include executable code (i.e. .c files) instead of linking the .c files and #including header (.h) files that provide declarations for the functions in the .c files.
This will happen if you #include source files (.c)... you are supposed to (for the most part) #include headers (.h). Headers generally provide function prototypes, typedefs, macros, etc. but leave out the actual implementation.
The actual implementation of functions, definitions of variables, etc. should appear exactly once in the program (in a single compilation unit, usually a .c file).
If you have other code that needs to re-use functions or variables defined in another compilation unit (e.g. Stack.c), you would #include Stack.h which would provide the function prototypes, global variable names, etc. that you might need.
Once you compile all of your compilation units, it is the linker's job to figure out which object file or library a function or variable is defined in. You drastically complicate its job when you #include "X.c" in another compilation unit, because then you wind up with multiple locations for the same thing (symbols, as the linker likes to call them).
In short, use headers and let the linker do its job.
On a related note, this has nothing to do with multiple-inheritance. That is an object-oriented issue, for languages like C++. The proper name for what you are describing is "symbol collision" or "duplicate symbols".
Without seeing details of your program, it is difficult to say. However, conflicting type errors can often be fixed simply by rearranging your code or adding function prototypes.
You see, functions need to be declared before they are called, because the compiler reads the source file from top to bottom. For example:
char foo1 ()
{
    char blah = foo2();
    return blah;
}
char foo2 ()
{
    return 'a';
}
You'd get a conflicting type error because, while compiling foo1, the compiler hasn't seen a declaration for foo2 yet. Under the old implicit-declaration rules it therefore assumes that foo2 returns an int, but the actual definition returns a char. The two don't match, so a conflicting type error is reported.
You can fix this by having foo2 come first in the source code, or by inserting a function prototype:
char foo2 (); // function prototype
char foo1 ()
{
    char blah = foo2();
    return blah;
}
char foo2 ()
{
    return 'a';
}
You can also get a conflicting type error if you include source files, for the same reason: #include "Node.c" is essentially a copy-paste. It would be a good idea to switch from including Node.c to including a Node.h that declares the externally visible functions. You can also avoid a lot of problems if you give your function names a prefix in source files you plan to share, e.g. nodeInsert, nodeDelete, nodeCompare, and so on.
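As a sketch of that suggestion (the file contents and function names here are assumptions based on the description), Node.c keeps the definitions while a new Node.h exposes only what other files need:
// Node.h (hypothetical)
#ifndef NODE_H
#define NODE_H
typedef struct Node {
    struct Node *next;
    int value;
} Node;

Node *nodeAlloc(int value);   // defined in Node.c
void nodeFree(Node *n);       // defined in Node.c
#endif

// Stack.c and Queue.c then do this instead of #include "Node.c":
#include "Node.h"
Node.c also #includes "Node.h" and provides the definitions; Stack.c, Queue.c, Node.c and the main program are compiled separately and linked together, so no symbol is defined twice.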

Makefile with unimplemented functions in library

First of all, I've been searching for an answer here and I haven't been able to find one. If this question really is a duplicate, please redirect me to the right answer and I'll delete it right away. My problem is that I'm writing a C library whose .h file declares a few functions that are not implemented in the library itself; they are meant to be implemented in the main.c that uses the library. However, one of the implemented functions in the library calls them. The makefile for the library gives me "undefined reference to" errors for every function that isn't implemented, so when I try to link the library's .o files against the main.c that does provide those implementations, I can't, because the library itself failed to build.
My question is: are there any flags I could put in the makefile so that it ignores the unimplemented functions, or resolves them once the library is linked?
This is a very old-fashioned way of writing a library (but I've worked on code written like that). It does not work well with shared libraries, as you are now discovering.
If you can change the library design
Your best bet is to rearrange the code so that the 'missing functions' are specified as callbacks in some initialization function. For example, you might currently have a header a bit like:
#ifndef HEADER_H_INCLUDED
#define HEADER_H_INCLUDED
extern int implemented_function(int);
extern int missing_function(int);
#endif
I'm assuming that your library contains implemented_function() but one of the functions in the library makes a call to missing_function(), which the user's application should provide.
You should consider restructuring your library along the lines of:
#ifndef HEADER_H_INCLUDED
#define HEADER_H_INCLUDED
typedef int (*IntegerFunction)(int);
extern int implemented_function(int);
extern IntegerFunction set_callback(IntegerFunction);
#endif
Your library code would have:
#include "header.h"
static IntegerFunction callback = 0;
IntegerFunction set_callback(IntegerFunction new_callback)
{
    IntegerFunction old_callback = callback;
    callback = new_callback;
    return old_callback;
}
static int internal_function(int x)
{
    if (callback == 0)
        ...major error...callback not set yet...
    return (*callback)(x);
}
(or you can use return callback(x); instead; I use the old school notation for clarity.) Your application would then contain:
#include "header.h"
static int missing_function(int x);
int some_function(int y)
{
    set_callback(missing_function);
    return implemented_function(y);
}
An alternative to using a function like set_callback() is to pass the missing_function as a pointer to any function that ends up calling it. Whether that's reasonable depends on how widely used the missing function is.
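A minimal sketch of that alternative, reusing the names from above (the extra parameter is an assumption for illustration):
/* header.h would instead declare: */
extern int implemented_function(int x, IntegerFunction callback);

/* library code: */
int implemented_function(int x, IntegerFunction callback)
{
    return callback(x);   /* the caller supplies the "missing" function */
}

/* application code: */
static int missing_function(int x) { return x + 1; }

int some_function(int y)
{
    return implemented_function(y, missing_function);
}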
If you can't change the library design
If that is simply not feasible, then you are going to have to find the platform-specific options for the commands that build shared libraries so that the missing references do not cause build errors. The details vary widely between platforms; what works on Linux won't work on AIX, and vice versa. So you will need to clarify your question to specify where you need the solution to work.

Possible flaws in the 'including *.c files' style of C programming

I came across some code organized in the following way:
//file.c
#include <stdio.h>
#include <stdlib.h>
void print(void){
    printf("Hello world\n");
}
and
//file main.c
#include <stdio.h>
#include "file.c"
int main(int argc, char *argv[]){
    print();
    return EXIT_SUCCESS;
}
Is there any flaw in this kind of programming style? I cannot pin down the flaw, although I feel there is one, because somewhere I read that separating the interface into *.h files and the implementation into *.c files helps the compiler check for consistency. I don't understand what is meant by consistency here.
I would be deeply thankful for some suggestions.
--thanks
Nothing prevents you from including .c files. However, separating declarations (in .h files) and implementations (in .c files) and then including only .h files has several advantages:
Compile time. Your declarations usually change less often than your implementation. If you include only .h files and make a change in your implementation (in the .c file), then you only have to recompile that one .c file, instead of every file that includes the modified file.
Readability and management of interfaces. All your declarations can be checked in a single glance at the (usually) small .h file, whereas the .c file is filled with lines and lines of code. Moreover, it helps you control which files see which functions and variables, for example to avoid having a global variable visible where you don't want it.
It's a common expectation that the compiler should compile .c files. .h files are not directly given to the compiler. They are usually only included within .c files.
Thus, your code is expected to compile by something like:
gcc main.c file.c
rather than just gcc main.c. In your case, however, gcc main.c file.c will fail at the linking stage with duplicate symbols, because main.c already pulls in the definition of print() via #include "file.c".
You're going to have problems if you include file.c in more than one source file that is combined into the same library/executable, since you'll end up with duplicate function definitions. The above strikes me as a poor means of sharing/reusing code, and is not to be recommended.
It is not uncommon to include data in another file if it is more convenient to separate it from the code. For example, XPM or raw BMP data in a char array could be included to embed an image in the program in this way. If the data is generated from another build step then it makes sense to include the file.
I would suggest using a different file extension to avoid confusion (e.g. *.inc, *.dat, etc.).
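For example, a minimal sketch of that idea (the file name and the bytes are made up):
/* logo.inc: generated by another build step, not meant to be a header */
static const unsigned char logo_bytes[] = {
    0x42, 0x4d, 0x36, 0x00, 0x00, 0x00,   /* ...rest of the raw image data... */
};

/* image.c */
#include "logo.inc"   /* embeds the array in this translation unit */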
Yes, this is permitted, but using it well is an advanced topic.
It slows down incremental development builds (it is cheaper to compile only what changed).
It speeds up full rebuilds (when every file is out of date anyway).
It allows the compiler to inline functions across module boundaries.
It allows a trick for controlling which symbols a library exports while keeping the source modular.
It might confuse the debugger.
There is nothing wrong from the C language's perspective with including a .c file in a program. The language does not care about the file's extension; C++, in fact, omits the extension on its standard headers entirely.
But from a programmer's perspective, yes, this is odd. Most programmers will assume that a .c file is not meant to be #included, and that incorrect assumption can cause problems. It's best to avoid this practice; if you find you must use it, it's usually a sign of poor design.
In the .h files you should place function prototypes. For example, in your code you should have:
//file.h
void print(void);
//file.c
#include <stdio.h>
#include "file.h"
void
print(void)
{
    printf("Hello world\n");
}
//file main.c
#include <stdio.h>
#include <stdlib.h>
#include "file.h"
int main(int argc, char *argv[]){
    print();
    return EXIT_SUCCESS;
}
There are only two reasons I know of for including .c files that make sense:
inline functions which are non-trivial, but that's really a matter of style
sharing the implementation of private (static) functions by including the same file in several other files; that's actually the only purely platform-independent way to do it (toolchain-specific tricks such as gcc's hidden visibility attribute are better when available) - see the sketch after this list
The flaws:
you compile the same code several times
if not used sparingly, it quickly leads to multiply defined public symbols, in a way that is difficult to debug (include files which include other files...)
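A minimal sketch of that second use, with hypothetical file names:
/* helpers.inc: shared private helpers, deliberately not a normal header */
static int clamp(int v, int lo, int hi)
{
    return v < lo ? lo : (v > hi ? hi : v);
}

/* stack.c */
#include "helpers.inc"   /* stack.c gets its own private copy of clamp() */

/* queue.c */
#include "helpers.inc"   /* queue.c gets another copy; no linker conflict */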
It is bad style, but one other reason to do it is that it can be used as part of a trick with the ## token concatenation operator to get a kind of poor man's templating in C.
Note that this is fairly evil, isn't recommended, and will produce code that's hard to debug and maintain, but you can do something like the following:
mytemplate.c:
#define PASTE2(a, b) a ## b
#define PASTE(a, b) PASTE2(a, b)
void PASTE(MyTemplateFunction_, MYTYPE)(MYTYPE x)
{
    // function code that works with all used MYTYPEs
}
main.c:
#define MYTYPE float
#include "mytemplate.c"
#undef MYTYPE
#define MYTYPE int
#include "mytemplate.c"
#undef MYTYPE
int main(void)
{
    float f = 1.5f;
    int i = 2;
    MyTemplateFunction_float(f);
    MyTemplateFunction_int(i);
    return 0;
}
The evils of macros can be exacerbated:
file1.c
#define bottom arse
file2.c
#include <stdio.h>
int f()
{
    int arse = 4;
    bottom = 3;   /* the macro from file1.c turns this into: arse = 3; */
    printf("%d", arse);
    return arse;
}
main.c
#include "file1.c"
#include "file2.c"
int main()
{
    f();
    return 0;
}
Indeed, a convoluted example. But normally you wouldn't hit this, because a macro's scope ends with the file it is defined in; once you start #including .c files, the macro leaks into everything included after it.
I did actually hit this bug. I was importing some library code into a new project and couldn't be bothered to write the Makefile, so I just generated an all.cpp that included all the sources from the library. It didn't work as expected due to macro pollution, and it took me a while to figure out.
It's fine for programs of this length.

Resources