I have a master.c and a slave.c file. Both are compiled separately, producing different contiki-ng modules.
As they both use a similar function, I created a helpers.h and helpers.c file, and added #include "helpers.h" to master.c and slave.c to deduplicate this function.
Additionally, I need to use the LOG_INFO function from log.h for logging in both slave.c/master.c and in helpers.c. log.h requires the macro LOG_MODULE to be set to the name of the module ("Master" or "Slave").
master.c:
#define LOG_MODULE "Master"
#include "sys/log.h"
#include "helpers.h"
void main() {
do_something();
}
slave.c:
#define LOG_MODULE "Slave"
#include "sys/log.h"
#include "helpers.h"
void main() {
do_something();
}
helpers.h:
void do_something();
helpers.c:
#include "sys/log.h"
#include "helpers.h"
void do_something() {
...
LOG_INFO("Result: x");
...
}
My problem is the following:
Even though I include "helpers.h" AFTER defining LOG_MODULE in master.c/slave.c, I still get a complaint about LOG_MODULE not being defined when I use LOG_INFO in helpers.c.
What is a proper and elegant solution to define LOG_MODULE with their respective values in slave.c and master.c, and have that definition used by helpers.c?
helpers.c is a separate compilation unit, and while it is being compiled it only "sees" the #defines made in that file and in the headers that this file itself includes.
To achieve what you want, you would need to:
Move the definitions of all the functions that use logging from the .c file to the .h file.
Make those functions static inline in the header to avoid multiple-definition linker errors if master.c and slave.c are later linked together to create a single binary.
helpers.h
#ifndef HELPERS_H
#define HELPERS_H
#include "sys/log.h"
static inline void do_something() {
/* ... */
LOG_INFO("Result: x");
/* ... */
}
#endif
Because do_something() is now compiled as part of master.c and slave.c themselves, the LOG_INFO inside it picks up whatever LOG_MODULE each of those files defined before including helpers.h.
What are advantages and disadvantages of both approaches?
Source vs. header implementation
Function definition inside source file
Header file sourcefunction.h contains declaration only.
#ifndef SOURCEFUNCTION_H
#define SOURCEFUNCTION_H
void sourcefunction(void);
#endif // SOURCEFUNCTION_H
Source file sourcefunction.c contains definition
#include "sourcefunction.h"
#include <stdio.h>
void sourcefunction(void) { printf(" My body is in a source file\n"); }
Function definition inside header file
Header file headerfunction.h contains the definition, which serves as the declaration at the same time.
#ifndef HEADERFUNCTION_H
#define HEADERFUNCTION_H
#include <stdio.h>
void headerfunction(void) { printf(" My body is in a header file\n"); }
#endif // HEADERFUNCTION_H
No source file is needed.
Consumer
File main.c
#include "sourcefunction.h"
#include "headerfunction.h"
int main(void) {
sourcefunction();
headerfunction();
return 0;
}
Why compile many source files?
We have to compile all source files and keep track of them during linking.
gcc -c sourcefunction.c
gcc -c main.c
gcc main.o sourcefunction.o
Make can handle the file management, but why even bother?
Is separation of interface and implementation always necessary?
It is an obvious win for big projects and teamwork: the designer specifies the interface, and the programmers implement the functionality.
What about smaller projects and a less formal approach?
Does keeping definitions out of header files always prevent linker errors?
Let's assume my program is using another module that defines a function with the same name, sourcefunction().
#include "sourcefunction.h"
#include "sourcefunction1.h"
#include "headerfunction.h"
int main(void) {
headerfunction();
sourcefunction();
return 0;
}
Different function interface
File sourcefunction1.h
#ifndef SOURCEFUNCTION1_H
#define SOURCEFUNCTION1_H
int sourcefunction(void);
#endif // SOURCEFUNCTION1_H
File sourcefunction1.c
#include "sourcefunction1.h"
#include <stdio.h>
int sourcefunction(void) { int a = 5; return a; }
By compiling main.c, I get a nice compiler error
sourcefunction1.h:4:5: error: conflicting types for 'sourcefunction'
showing me the location of the error.
Same function interface
File sourcefunction1.h
#ifndef SOURCEFUNCTION1_H
#define SOURCEFUNCTION1_H
void sourcefunction(void);
#endif // SOURCEFUNCTION1_H
File sourcefunction1.c
#include "sourcefunction1.h"
#include <stdio.h>
void sourcefunction(void) { int a = 5; printf("%d",a); }
The compiler does not mind multiple identical declarations, but I get an ugly linker error.
Can a header-only implementation serve as a library?
jschultz410 says:
If you are writing a library and all your function definitions are in headers, then other people who do segment their development into multiple translation units will get multiple definitions of your functions if they are needed in multiple translation units
Let's have
File consumer1.c
#include "headerfunction.h"
void consume1(void) { headerfunction(); }
File consumer2.c
#include "headerfunction.h"
void consume2(void) { headerfunction(); headerfunction();}
File twoConsumers.c
extern void consume1(void);
extern void consume2(void);
int main(void) {
consume1();
consume2();
return 0;
}
Let's compile sources.
gcc -c consumer1.c
gcc -c consumer2.c
gcc -c twoConsumers.c
So far, so good. Now, linking.
gcc consumer1.o consumer2.o twoConsumers.o
Linker error: multiple definition of 'headerfunction', of course.
But I can make my library function static.
File headerfunction.h, after the change:
#ifndef HEADERFUNCTION_H
#define HEADERFUNCTION_H
#include <stdio.h>
static void headerfunction(void) { printf(" My body is in a header file\n"); }
#endif // HEADERFUNCTION_H
It hides the definition from other translation units.
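Marking the function static gives every translation unit its own private copy. A common variant (my addition, not part of the quoted answer) is static inline, which additionally hints that calls may be inlined and, on compilers such as GCC, avoids unused-function warnings in translation units that include the header but never call the function:
File headerfunction.h, static inline variant.
#ifndef HEADERFUNCTION_H
#define HEADERFUNCTION_H
#include <stdio.h>
/* static: each translation unit gets its own copy; inline: hints inlining and silences unused-function warnings */
static inline void headerfunction(void) { printf(" My body is in a header file\n"); }
#endif // HEADERFUNCTION_H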
I shouldn't answer this, but I will.
This can create duplicate definitions unless you really only have a single .c file in your project (unwise). Even header guards won't prevent the headers from being included multiple times if those inclusions happen in different .c files. When the .obj files are linked together, there will be conflicts.
If only the function declaration and not definition is in the header, then only changes to the interface (the function name, parameters or return type) require recompiling dependencies. However, if the entire definition is in the header, then any change to the function requires recompiling all .c and .h files that depend on it, which, in a larger project, can create a lot of unnecessary recompiling.
It's not the convention. Libraries will not use this convention, so you'll be stuck dealing with their header file structure. Other developers will not use this convention, so you can create confusion or annoyance there.
In the K&R book (p59) (edit: second edition, covering ANSI C), it is suggested that it is easier to split larger projects into multiple files. In each file, several libraries are included at the top as usual: e.g. getop.c needs stdio.h, and so does stack.c and so does main.c.
The snippets are something like this:
//main.c
#include <stdio.h>
#include <stdlib.h>
#include "calc.h"
int main(void)
{
//etc
}
//getop.c
#include <stdio.h>
#include <ctype.h>
#include "calc.h"
int getop(char s[])
{
/*code*/
}
//stack.c
#include <stdio.h>
#include "calc.h"
void push(double val)
{
//code
}
I am having trouble figuring out how including the standard libraries several times in a project works. Of course, for the custom .c files to be able to access built-in functions, we need #include <header.h> so that they are aware of the existence of printf() and getchar() and so on, but wouldn't this approach increase the size of the final program if stdio.h is included four times instead of once (as it would be if everything were placed in one file)?
K&R does point out that splitting a program over several files eventually makes it more difficult to maintain all the .h files.
I suppose what I am really asking is how does the compiler figure out the problem of one library being #included several times in a project.
I have read up on using include guards, but it seems that is not needed for this implementation, as they deal with ensuring that identical bits of code aren't included twice, as in:
File "module.h"
#ifndef MODULE_H
#define MODULE_H
struct foo {
int member;
};
#endif /* MODULE_H */
File "mod2.h"
#include "module.h"
File "prog.c"
#include "module.h"
#include "mod2.h"
I suppose what I am really asking is how does the compiler figure out the problem of one library being #included several times in a project.
You don't include a library with #include <stdio.h>; you just include its declarations, so the compiler knows what functions exist. The linker takes care of pulling in the library and putting everything together.
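A small illustration of this split (file name made up for the example, assuming gcc on a typical Linux setup): the header only gives the compiler the declaration of sqrt(); its definition lives in the math library and is only pulled in by the linker.
uselib.c
#include <stdio.h>
#include <math.h>
int main(int argc, char *argv[])
{
    double x = argc;          /* runtime value, so the call cannot be folded away at compile time */
    printf("%f\n", sqrt(x));  /* the declaration of sqrt comes from math.h */
    return 0;
}
gcc -c uselib.c     # compiling needs only the declarations from the headers
gcc uselib.o -lm    # linking pulls the definition of sqrt from the math library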
Because they use something called include guards. Suppose your own include files were to be included more than once; then you can do this:
MyHeader.h
#ifndef MY_HEADER_H
#define MY_HEADER_H
/* file content goes here */
#endif /* MY_HEADER_H */
Then you have another header
AnotherHeader.h
#ifndef MY_ANOTHER_HEADER_H
#define MY_ANOTHER_HEADER_H
#include "MyHeader.h"
/* file content goes here */
#endif /* MY_ANOTHER_HEADER_H */
and in your program
main.c
/*
* MY_HEADER_H is undefined so it will be defined and MyHeader.h contents
* will be included.
*/
#include "MyHeader.h"
/*
* MY_HEADER_H is defined now, so MyHeader.h contents will not be included
*/
#include "AnotherHeader.h"
int main()
{
return 0;
}
Since the included files are only included once per compilation unit, the resulting binary size will not increase. Besides, the inclusion of header files only increases the compiled file size when, for example, there are string literals or other definitions in those headers; otherwise they only provide information to the compiler about how to call a given function, i.e. how to pass parameters to it and how to store its returned value.
This is my code. I have file1.c and file2.c. I want to use MESSAGE from file2.c, but I can't seem to do it. I am a newbie in C, so I really don't know what to do. I have researched already, but I can't seem to find a specific answer. Thank you.
file2.c:
#define MESSAGE "this is message!"
helloworld(){
printf("%s",MESSAGE);
getch();
}
file1.c:
#include <stdio.h>
#include <stdlib.h>
#include <conio.h>
#include "file2.c"
int main(void)
{
helloworld();
}
There are a few misconceptions here. First of all, the concept of "calling" a macro: that's not possible; even if a macro looks like a function, it's not a function, and macros are not actually handled by the compiler. Instead, macros are part of a separate language that is handled by a preprocessor, which takes the source file and modifies it to generate a translation unit that the compiler sees. (For more information about the different phases of "compilation" see e.g. this reference.)
The preprocessor does this by basically doing a search-replace in the input source file: When it sees a macro "invocation" it simply replaces that with the "body" of the macro. When it sees an #include directive, it preprocesses the file and then puts the content in place of the directive.
So in your code, when the preprocessor sees the macro MESSAGE it is literally replaced by "this is message!". The actual compiler doesn't see MESSAGE at all, it only sees the string literal.
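For example, the line in your code
printf("%s",MESSAGE);
reaches the compiler as
printf("%s","this is message!");
and the name MESSAGE is gone entirely by then.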
Another misconception is how you use the #include directive. You should not use it to include source files. Instead you compile the source files separately (which creates object files) and then link the generated object files together with whatever libraries are needed to form the final executable.
To solve the problem of macros (and other declarations) being available to all source files, you use header files. These are like source files, but only contains declarations and macros. You then include the header file in both source files, and both source files will know about the declarations and macros available in the header file.
So in your case you should have three files: The main source file, the source file containing the function, and a header file containing the macro and the function declaration (also known as a prototype). Something like
Header file, e.g. header.h:
// First an include guard (see e.g. https://en.wikipedia.org/wiki/Include_guard)
#ifndef HEADER_H
#define HEADER_H
// Define the macro, if it needs to be used by all source files
// including this header file
#define MESSAGE "this is message!"
// Declare a function prototype so it can be used from other
// source files
void helloworld(void);
#endif
Main source file, e.g. main.c:
// Include a system header file, to be able to use the `printf` function
#include <stdio.h>
// Include the header file containing common macros and declarations
#include "header.h"
int main(void)
{
// Use the macro
printf("From main, MESSAGE = %s\n", MESSAGE);
// Call the function from the other file
helloworld();
}
The other file, e.g. hello.c:
// Include a system header file, to be able to use the `printf` function
#include <stdio.h>
// Include the header file containing common macros and declarations
#include "header.h"
void helloworld(void)
{
printf("Hello world!\n");
printf("From helloworld, MESSAGE = %s\n", MESSAGE);
}
Now, if you use a command-line compiler like gcc or clang then you can simply build it all by doing e.g.
$ gcc -Wall main.c hello.c -o myhello
That command will take the two source files, main.c and hello.c and run the preprocessor and compiler on them to generate (temporary) object files. These object files are then linked together with the standard C library to form the program myhello (that's what the option -o does, names the output file).
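If you want to see the intermediate steps explicitly, the same build can be done in stages; this is roughly what gcc does for you behind the scenes:
gcc -c main.c                   # preprocess and compile, producing main.o
gcc -c hello.c                  # preprocess and compile, producing hello.o
gcc main.o hello.o -o myhello   # link the object files (plus the C library) into myhello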
You can then run myhello:
$ ./myhello
From main, MESSAGE = this is message!
Hello world!
From helloworld, MESSAGE = this is message!
In your file2.c, MESSAGE is a preprocessor macro, which means the text MESSAGE will be replaced with the string "this is message!". It is not visible outside the file. This is because in C, translation units are the final inputs to the compiler, and these translation units already have all preprocessor macros replaced by their replacement text.
If you want to have a common variable, you should declare the variable as extern in a .h header file, and then #include the file where you need to use it.
see Compiling multiple C files in a program
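A minimal sketch of that approach (names made up for the example): the header declares the variable, exactly one .c file defines it, and every file that uses it includes the header.
shared.h
#ifndef SHARED_H
#define SHARED_H
extern int shared_count;   /* declaration only; no storage is allocated here */
#endif
shared.c
#include "shared.h"
int shared_count = 0;      /* the one and only definition */
other.c
#include "shared.h"
void bump(void) { shared_count++; }   /* uses the variable declared in the header */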
You have to put your #define in a .h file and include it in .c files where you want to use it.
You can write the files as below and compile the code as I describe in the following steps.
file1.h
#ifndef FILE1_H
#define FILE1_H
#define MESSAGE "this is message!"
extern void helloworld();
#endif
file1.c
#include <stdio.h>
#include <conio.h>
#include "file1.h"
void helloworld()
{
printf("%s",MESSAGE);
getch();
}
file2.c
#include <stdio.h>
#include <stdlib.h>
#include <conio.h>
#include "file1.h"
int main(void)
{
helloworld();
return 0;
}
For compiling,
gcc -Wall file1.c file2.c -o myprog
./myprog
Here is code, try this:
In File1.C
#include <stdio.h>
#include <conio.h>
#define FILE1_C
#include "file1.h"
void helloworld()
{
printf("%s",MESSAGE);
getch();
}
In File2.C
#include <stdio.h>
#include <stdlib.h>
#include <conio.h>
#include "file1.h"
int main(void)
{
helloworld();
}
In File1.h
#ifdef FILE1_C
#define MESSAGE "this is message!"
#define EXTERN
#else
#define EXTERN extern
#endif
EXTERN void helloworld(void);
So I'm still getting used to modular programming, and want to make sure I'm adhering to best practices. If I have the two module header files below, will the headers #included by each file (for example "mpi.h") be included multiple times? Is there a proper way to account for this?
Also, my module headers typically look like these examples, so any other criticism/pointers would be helpful.
/* foo.h */
#ifndef FOO_H
#define FOO_H
#include <stdlib.h>
#include "mpi.h"
void foo();
#endif
and
/* bar.h */
#ifndef BAR_H
#define BAR_H
#include <stdlib.h>
#include "mpi.h"
void bar();
#endif
And use the sample program:
/* ExampleClient.c */
#include <stdlib.h>
#include <stdio.h>
#include "mpi.h"
#include "foo.h"
#include "bar.h"
void main(int argc, char *argv[]) {
foo();
MPI_Func();
bar();
exit(0)
}
What do you mean by 'include'? The preprocessor statement #include file copies the contents of file and replaces the statement with those contents. This happens no matter what.
If by 'include' you mean "the statements and symbols in these files will be parsed multiple times causing warnings and errors", then no, the include guards will prevent that.
If by 'include' you mean "some part of the compiler will read some part of these files", then yes, they'll be included multiple times. The preprocessor will read the second inclusion of the file and replace it with a blank line because of the include guards, which incurs a tiny overhead (the file is already in memory). Modern compilers (GCC, not sure about others) are probably optimized to avoid even this: they note on the first pass that the file has include guards and simply skip future inclusions, removing the overhead. Don't worry about speed here; clarity and modularity are more important. Compilation is a time-consuming process, for sure, but #include is the least of your worries.
To better understand include guards, consider the following code sample:
#ifndef INCLUDE_GUARD
#define INCLUDE_GUARD
// Define to 1 in first block
#define GUARDED 1
#endif
#ifndef INCLUDE_GUARD
#define INCLUDE_GUARD
// Redefine to 2 in second block
#define GUARDED 2
#endif
After (the first pass of) preprocessing, what will GUARDED be defined to? The preprocessor statement #ifndef, or its equivalent #if !defined(), evaluates to false if its argument is already defined. Therefore, we can conclude that the second #ifndef will evaluate to false, so only the first definition of GUARDED will remain after the first pass of the preprocessor. Any instance of GUARDED remaining in the program will be replaced by 1 on the next pass.
In your example, you've got something slightly (but not much) more complicated. Expanding all the #include statements in ExampleClient.c will result in the following source: (Note: I indented it, but that's not normal style for headers and the preprocessor won't do it. I just wanted to make it more readable)
/* ExampleClient.c */
//#include <stdlib.h>
#ifndef STDLIB_H
#define STDLIB_H
    int abs (int number); //etc.
#endif
//#include <stdio.h>
#ifndef STDIO_H
#define STDIO_H
    #define NULL 0 //etc.
#endif
//#include "mpi.h"
#ifndef MPI_H
#define MPI_H
    void MPI_Func(void);
#endif
//#include "foo.h"
#ifndef FOO_H
#define FOO_H
    //#include <stdlib.h>
    #ifndef STDLIB_H
    #define STDLIB_H
        int abs (int number); //etc.
    #endif
    //#include "mpi.h"
    #ifndef MPI_H
    #define MPI_H
        void MPI_Func(void);
    #endif
    void foo(void);
#endif
//#include "bar.h"
#ifndef BAR_H
#define BAR_H
    //#include <stdlib.h>
    #ifndef STDLIB_H
    #define STDLIB_H
        int abs (int number); //etc.
    #endif
    //#include "mpi.h"
    #ifndef MPI_H
    #define MPI_H
        void MPI_Func(void);
    #endif
    void bar(void);
#endif
void main(int argc, char *argv[]) {
    foo();
    MPI_Func();
    bar();
    exit(0); // Added missing semicolon
}
Go through that code and note when various definitions are performed. The result is:
#define STDLIB_H
int abs (int number); //etc.
#define STDIO_H
#define NULL 0 //etc.
#define MPI_H
void MPI_Func(void);
#define FOO_H
void foo(void);
#define BAR_H
void bar(void);
With respect to your request for other criticism/pointers, why are you #including stdlib.h and mpi.h in all your headers? I understand that this is a stripped-down example, but in general, header files should only include the files necessary for the declaration of their contents. If you use a function from stdlib or call MPI_func() in foo.c or bar.c, but the function declarations are simply void foo(void), you shouldn't include these files in the header file. For example, consider the following module:
foo.h:
#ifndef FOO_H
#define FOO_H
void foo(void);
#endif
foo.c:
#include <stdlib.h> // Defines type size_t
#include "mpi.h" // Declares function MPI_func()
#include "foo.h" // Include self so type definitions and function declarations
// in foo.h are available to all functions in foo.c
void foo(void) {
char msg[] = "Message";
size_t length = sizeof msg;
MPI_func(msg, length);
}
In this example, the implementation of foo() requires stuff from stdlib and mpi, but the definition does not. If foo() returned or required a size_t value (typedef'ed in stdlib), you'd need to #include stdlib in the .h file.
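For instance, a sketch of that situation (foo_length is a name made up for the example): when the prototype itself mentions size_t, the header has to pull in a header that provides it.
foo.h
#ifndef FOO_H
#define FOO_H
#include <stdlib.h>   /* provides size_t for the prototype below (stddef.h would also do) */
size_t foo_length(void);
#endif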
Mostly no, with a bit of 'yes'. Your header files will be 'read' more than once, but on the second and later reads the preprocessor will discard all of their contents. This means it won't waste your compiler's time, and the #includes inside your #ifndef blocks will also be processed only once (per header file, per compilation unit).
It's a good practice. Myself, I also add the following line before #ifdefs:
#pragma once
When supported by the particular compiler, it guarantees that the file will actually be read only once. I think it's a little more efficient that way.
So, to sum up:
header guards like you are using prevent the compiler from interpreting the header contents more than once, but can still cause the preprocessor to parse the file more than once (which is not a big problem),
#pragma once causes the particular header file to be read only once.
When using both, #pragma once takes effect if supported by the compiler; if not, the header guards still apply.
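A sketch of a header combining both techniques (the module name is made up for the example):
mymodule.h
#pragma once            /* read-once hint, honored by compilers that support it */
#ifndef MYMODULE_H
#define MYMODULE_H      /* classic include guard as the portable fallback */
void mymodule_init(void);
#endif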
1) GOOD: you have an "include guard". "stdlib.h", "mpi.h" and "void foo()" are only seen by the compiler the first time you #include "foo.h"
/* foo.h */
#ifndef FOO_H
#define FOO_H
#include <stdlib.h>
#include "mpi.h"
void foo();
#endif
2) BAD: This will #include the entire contents of "foo.h" every time you use it:
/* foo.h */
#include <stdlib.h>
#include "mpi.h"
void foo();
3) By "the first time you #include", I mean "once per compilation unit" (i.e. within the same .c source file).
This mainly "protects" against a header (foo.h) including another header (bar.h) which might recursively include the first header.
Every different compilation unit that #includes foo.h will always get "stdlib.h", "mpi.h" and "void foo()". The point is that they'll be seen only once - not multiple times in the same compilation unit.
4) This is all "compile-time". It has nothing to do with libraries (which are "link time").
Yes, mpi.h will be included multiple times (as will stdlib.h); if mpi.h has include guards similar to foo.h and bar.h, then it won't be an issue.
I'm trying to understand how global variables and functions work in C. My program compiles and works fine with gcc, but does not compile with g++. I have the following files:
globals.h:
int i;
void fun();
globals.c:
#include "stdlib.h"
#include "stdio.h"
void fun()
{
printf("global function\n");
}
main.c:
#include "stdlib.h"
#include "stdio.h"
#include "globals.h"
void myfun();
int main()
{
i=1;
myfun();
return 0;
}
And finally, myfun.c:
#include "stdlib.h"
#include "stdio.h"
#include "globals.h"
void myfun()
{
fun();
}
I get the following error when compiling with g++:
/tmp/ccoZxBg9.o:(.bss+0x0): multiple definition of `i'
/tmp/ccz8cPTA.o:(.bss+0x0): first defined here
collect2: ld returned 1 exit status
Any ideas why? I would prefer to compile with g++.
Every file you include globals.h from will define "int i".
Instead, put "extern int i;" into the header file and then put the actual definition of "int i = 1;" in globals.c.
Putting header guards around globals.h would be sensible too.
Edit: In answer to your question, it's because a #include works kind of like a cut and paste: it pastes the contents of the included file into the .c file that you are including it from. As you include "globals.h" from both main.c and myfun.c, you define int i in both files. This symbol, being global, gets put into the table of linkable symbols. If the same variable name is defined twice, the linker won't be able to tell which one it needs, and you get the error you are seeing. By adding extern at the front in the header file, you instead tell each file that "int i" is defined somewhere else. Obviously, you then need to define it somewhere (and ONLY in one place), so defining it in globals.c makes perfect sense.
Hope that helps :)
I would add an include guard in your globals file
#ifndef GLOBALS_H
#define GLOBALS_H
int i;
void fun();
#endif
Edit: Change your globals to be like this (using extern as the other answer describes)
globals.h
extern int i;
extern void fun();
globals.c
#include "stdlib.h"
#include "stdio.h"
int i;
void fun()
{
printf("global function\n");
}
I compiled it with
g++ globals.c main.c myfun.c
and it ran ok
Several things wrong here; several other things highly recommended:
globals.h:
#ifndef GLOBALS_H
#define GLOBALS_H
extern int my_global;
#ifdef __cplusplus
extern "C" {
#endif
void fun();
#ifdef __cplusplus
}
#endif
#endif
/* GLOBALS_H */
globals.c:
#include <stdlib.h>
#include <stdio.h>
#include "globals.h"
int my_global;
void fun()
{
printf("global function: %d\n", my_global);
}
main.c:
#include <stdlib.h>
#include <stdio.h>
#include "globals.h"
void myfun();
int main()
{
my_global=1;
myfun();
return 0;
}
void myfun()
{
fun();
}
You should declare "extern int my_global" in your header, and actually define "int my_global" in one and only one .c file.
You should include "globals.h" in every file that uses "my_global" - including the file where it's defined.
Especially if you're planning on mixing C and C++ modules, you should use 'extern "C"' to distinguish non-C++ functions.
System headers should be "#include <some_header.h>"; your own headers should use quotes (#include "myheader.h") instead.
Short variable names like "i" might be OK for a strictly local variable (like a loop index), but you should always use longer, descriptive names whenever you can't avoid using a global variable.
I added a "printf" for my_global.
'Hope that helps!
I had this problem when porting some old C code to C++. It was a project connected to a database, and I wanted to port the database code to C++ but not the rest. The database pulled in some C dependencies that couldn't be ported, so I needed the C code shared by the database and the other project to compile under g++ as well as gcc...
The solution to this problem is to declare all the shared variables as extern in the .h file. Then, when you compile with either gcc or g++, it will report the symbols missing from the .c files. So edit the .c files named in the error messages and insert the definitions into the .c files that need the variables. Note: you may have to do this in multiple .c files, which is what threw me and why I was stuck on this problem for ages.
Anyway, this solved my problem and the code compiles cleanly under both gcc and g++ now.