I am programming an AVR microcontroller using Programmers Notepad from the WinAVR suite.
I am trying to split my code into separate files, but in the separate .c file I am unable to use the AVR pre-defined register names (the identifiers AVR supplies to refer to certain registers and bits).
for example,
this code works in my main.c file, but not in another random.c file:
UBRR0H = (unsigned char)(ubrr>>8);
it gives the error:
random.c:6: error: 'UBRR0H' undeclared (first use in this function)
My main.c file has only the following includes:
#include <stdio.h>
#include <stdlib.h>
#include <util/delay.h>
#include <string.h>
#include <avr/interrupt.h>
#include "lcd.h"
#include "random.h"
You have to include avr/io.h in your project and also specify the MCU on the gcc command line with the -mmcu= option.
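For example, a minimal sketch of what random.c could look like (the uart_init helper and its ubrr parameter are just illustrations, not your actual code):
/* random.c - sketch; <avr/io.h> provides the register names for the MCU selected with -mmcu= */
#include <avr/io.h>

void uart_init(unsigned int ubrr)   /* hypothetical helper */
{
    UBRR0H = (unsigned char)(ubrr >> 8);
    UBRR0L = (unsigned char)ubrr;
}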
You have to create a file like yours.h, where you put your function declarations and macro definitions:
#define UBRR0H (unsigned char)(ubrr>>8)
int mine_function( char, char, int);
...
extern int global_variable;
I am not sure whether UBRR0H is a macro or an external variable.
In addition, read up on extern variables and find some articles about how to use them.
Then, in every one of your .c files, you should:
#include "yours.h"
If you run into trouble because you end up with many .h files and include the same thing multiple times (which will cause "previously defined" errors), there is an easy fix in yours.h:
#ifndef _H_YOURS_INCLUDED_
#define _H_YOURS_INCLUDED_ 1
// Your real content
#endif /* _H_YOURS_INCLUDED_ */
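As a quick illustration of the guard in action (the other.h and some.c names are made up), a translation unit that ends up pulling in yours.h twice still compiles cleanly, because the second pass over it is skipped:
/* other.h - some other header that happens to need yours.h too */
#include "yours.h"

/* some.c */
#include "yours.h"   /* first inclusion: _H_YOURS_INCLUDED_ gets defined */
#include "other.h"   /* yours.h is seen again here, but its body is skipped */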
If you're using "library" definitions in any compilation unit (.c file), you'll need to include the right headers in that unit (file). I'm guessing you're missing an #include or the like in the random.c file. Having it just in main.c won't help the compiler while it's compiling random.c. :)
(The linker is a different matter.)
One way to find out where a definition lives is to simply grep the compiler and libc source (include) directories and look for the name. That won't necessarily tell you what you're meant to do to get it. I suspect this one is a chip-specific register name that appears in the include file for your specific chip and gets pulled in via io.h depending on the compiler switches.
If it goes missing while using a different chip, check the datasheet to make sure the register/peripheral exists in your particular chip and check the include files for the exact spelling. There may be differences.
There are two header files. The first is _stub_defs.h:
///stub code
#pragma once
#include "random.h"
#include <stdarg.h>
and the second is stdarg.h:
#ifndef __GNUC_VA_LIST
#define __GNUC_VA_LIST
typedef __builtin_va_list __gnuc_va_list;
#endif
When I use my cross-compiler (sparc-rtems-gcc) to compile, both header files are included. Then the terminal tells me:
warning: #pragma once is obsolete
stdarg.h: conflicting types for `__gnuc_va_list'
stdarg.h: previous declaration of `__gnuc_va_list'
Apparently the include guards do not work. Is this a problem with the header files' code or with my cross-compiler?
The include guards work. You have another problem.
The best way to debug this is to run only the C preprocessor. For gcc (including a cross-compiling gcc), you can use the -E option. Just add it to your compile command. Instead of getting an object file, you will get a C file as it looks after the preprocessor stage.
Take that file, and search for the duplicate definition there. The file will also have markers that tell the compiler which file this definition originally came from, as well as markers when includes are nested. If you follow those, you will see both where the two definitions come from and which file included each of them.
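For illustration (the file name yourfile.c and the line numbers are made up), those markers have the form # linenum "filename" flags, where flag 1 marks entry into an included file and flag 2 marks the return to the including file, so a nested inclusion shows up roughly like this:
# 1 "yourfile.c"
# 1 "_stub_defs.h" 1
# 1 "stdarg.h" 1
typedef __builtin_va_list __gnuc_va_list;
# 3 "_stub_defs.h" 2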
For example I have a can.h header file that contains
#ifdef MC
extern int speed;
// many more variables and function prototypes
#endif
EDIT: I want to give the user an option to define MC that enables these variable and function prototypes. It would be defined in main.c alongside the include directives (e.g. #define MC); however, other header files cannot see what is defined in main.c. Instead, I ended up defining such macros in the can.h header itself. All I can think of is also writing a main.h, which can.h would then include. Are there other ways around this problem?
It's not entirely clear to me what you are trying to do, but perhaps you want this:
/* in file can.h */
extern int speed;
and then
/* in file main.c */
#include "can.h"
int speed;
The header can.h just declares the existence of speed so that other modules can refer to it as an external symbol. The storage for this object is then allocated in main.c, where you write the definition of speed.
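To illustrate (motor.c and its function are made-up names), any other module can then refer to that same object through the header:
/* in file motor.c - hypothetical third module */
#include "can.h"

void limit_speed(void)
{
    if (speed > 100)   /* refers to the single speed defined in main.c */
        speed = 100;
}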
You can either create a configuration header that has all the macros that you want defined and #include it in every header, or you can define configuration macros on the command line.
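A minimal sketch of the configuration-header option (config.h is an assumed file name); the command-line equivalent with gcc would be passing -DMC:
/* config.h */
#ifndef CONFIG_H
#define CONFIG_H
#define MC   /* comment this out to hide the MC-specific declarations */
#endif

/* can.h */
#include "config.h"
#ifdef MC
extern int speed;
/* more variables and function prototypes */
#endif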
Use #include - Then you get those bits that you require
Inclusion is simply file-content insertion by the preprocessor; anything declared or defined prior to the inclusion is visible in the included file. So:
#define MC
#include "can.h"
While that achieves what you asked, I would not necessarily recommend it, and might question the design.
You can have such a macro added at your project level.
Such macros are also known as preprocessor definitions. Visual Studio provides them in the project settings/configuration.
So you can add the macro there and compile the solution with the mentioned declarations included, or remove it if you don't want them.
Alright, I understand how the extern declaration works, but I don't know what would be the "best" place to put it. Consider the following file structure:
main.c / main.h / global.h
drv_adc.c / drv_adc.h
drv_pwm.c / drv_pwm.h
As you might guess, this is quite common for a small microcontroller. The two drivers work on different parts of the hardware and have no interdependencies. Both drivers can set a flag (say adc_irq_occured, pwm_irq_occured) which indicates that something has happened and which will be handled in main.c.
Now I can think of two approaches for where to put the "extern bool adc_irq_occured;" declaration:
In drv_adc.h: it somewhat belongs to the ADC driver, so I could add it to that header file and define the variable in main.c.
Turn the logic around, place the extern declaration in my main.h (or global.h if it has to be so), and define the variable in drv_adc.c.
Now the question: Which option is the preferred option here? Is there any good book where I could read about such topics?
In main.c:
int flag = 0;
In main.h:
extern int flag;
In drv_adc.c:
#include "main.h"
In drv_pwm.c:
#include "main.h"
Now that the variable is global, it is less secure, so use it with caution, and make sure any other drv files won't tamper with it.
-- EDIT --
Why not the other way round?
We put the extern declaration in the header that is included by your other files. This way we declare it once, telling the compiler that a single shared instance of the flag variable is visible to both drv files. For more clarity, read this discussion: Difference between putting variables in header vs putting variables in source.
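Putting the pieces together, a minimal sketch (the handler name is made up; volatile is added here because the flag is written from interrupt context and polled from the main loop):
/* main.h */
extern volatile int adc_irq_occured;

/* main.c */
#include "main.h"
volatile int adc_irq_occured = 0;   /* the single definition */

/* drv_adc.c */
#include "main.h"
void adc_irq_handler(void)   /* hypothetical interrupt handler */
{
    adc_irq_occured = 1;   /* main loop polls and clears this flag */
}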
I have an ANSI C program comprising two files. The first file contains the main() function, and the second file contains other functions that the first file calls. Before the main() function definition, I've placed the following code:
#define PI 3.14159265358979323846
but the 2nd file didn't see this variable. The first file sees it fine. Then, I placed this same line in the second file (while keeping it in the first file as above), before the function definitions, but still the second file doesn't see it. Things always compile fine, but when tracing the variable PI in gdb, it shows "No symbol "PI" in current context."
How to make PI a global constant viewable across all files compiled in the application?
EDIT / UPDATE:
Based on the response so far, I've created the following file:
myheader.h
#ifndef my_header_stuff
#define my_header_stuff
#define PI 3.1415926535897932384626433832795
#endif
and in the two files I want to see this constant PI, I've included this file as follows:
file1.c
#include <stdio.h>
#include <stdlib.h>
#include "myheader.h"
int main(void) {
etc...
}
and file2.c
#include <stdio.h>
#include <stdlib.h>
#include "myheader.h"
double interesting_function(void) {
etc...
}
Questions:
When I use GDB to debug, b PI returns (in both files, same result) "No symbol "PI" in current context". However, the math depending on PI is computed correctly. Is there a way to view PI in gdb?
Can I also include the two lines for stdio and stdlib in the myheader.h file?
Can I also include any function prototypes in the myheader.h file? If I do, and then let's say I create a file3.c that doesn't require any of these prototypes because it doesn't use those functions, is any harm done?
It's normal that a macro definition doesn't show up in GDB. That is because the preprocessor replaces each PI with the actual value before compilation.
Yes. In your case, it doesn't hurt. If the .h file contains a prototype (prototypes are declarations) that uses a type defined somewhere else, you can simply #include the header that defines that type in your header file.
Yes. You can declare whatever function prototypes you want and they won't do any harm in terms of the functionality of your program. The compiler won't even check for the existence of the definitions of those functions unless you actually call them somewhere.
A macro isn't a variable, and it won't appear in gdb as a variable. Things are compiling fine because they're working fine; you just have the wrong expectations for gdb.
In general, a macro that's needed in more than one place should be defined in a header file that's included everywhere it's needed; for instance, the macro M_PI is already defined to be pi in the standard include math.h.
The other thing that you can do is to have const double PI = 3.14159etcetc; in one file and extern const double PI; in all of the others, but this is a pain because it requires the definition of the value to exist in exactly one compilation unit.
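A minimal sketch of that approach (the file names are just for illustration):
/* constants.h */
extern const double PI;   /* declaration only, no storage */

/* constants.c - the one compilation unit that owns the definition */
const double PI = 3.14159265358979323846;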
I'd declare PI in the header file and include that header file in all source files.
You should place function declarations and macros in your header files (and maybe some simple inline functions).
Implementations should be placed in .c files.
// I edited the answer in response to your comment
You could create a global variable.
//file1.c
int my_global;
//file2.c
extern int my_global;
//code using my_global...
Note that use of the preprocessor for this is often discouraged.
Also note that the M_PI define suggested by others is not part of the C standard (not even C99), so you cannot rely on it being available.
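If you still want to use M_PI, a common fallback (just a sketch) is to define it yourself only when the header did not:
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846   /* fallback when <math.h> doesn't provide it */
#endif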
If you are working only with standard math definitions, the standard header file math.h has some good ones:
#include <math.h>
...
double area = radius * radius * M_PI;
For defining global constants, there are many methods, adequately explained in other answers.
It would be very nice if you created a header file for the file that's included in the main source file; it is good practice too, and you will improve your C language skills. Imagine that we have a source file with functions, global variables and constants that we need in the main source file; we can create a header file for that file and put those declarations there:
#ifndef _FILE_H
#define _FILE_H
#include "file.h"
#define PI 3.14
/**
*function prototypes here, or other constants or global variables
*/
void do_something(char **param1, int *param2);
#endif
In your "file.c" file you should do the folowing:
#inlcude "file.h"
void do_something(char **param1, int *param2)
{
/**
*do something
*/
}
In your "main.c" file you should do :
#include "file.h"
/**
*Now you can use "PI"
*/
I hope I helped somehow.
The preprocessor directive is replaced by the actual value before compilation, so in your case every PI in the first file is replaced by 3.14..., and there is no PI symbol in the symbol table of that translation unit. That is why gdb says there is no 'PI' symbol.
If you want to use #define and don't want to use a header file, you need to #define PI at the beginning of each file.
Should header files have #includes?
I'm generally of the opinion that this kind of hierarchical include is bad. Say you have this:
foo.h:
#include <stdio.h> // we use something from this library here
struct foo { ... } foo;
main.c
#include "foo.h"
/* use foo for something */
printf(...)
The day main.c's implementation changes, and you no longer use foo.h, the compilation will break and you must add <stdio.h> by hand.
Versus having this:
foo.h
// Warning! we depend on stdio.h
struct foo {...
main.c
#include <stdio.h> //required for foo.h, also for other stuff
#include "foo.h"
And when you stop using foo, removing it breaks nothing, but removing stdio.h will break foo.h.
Should #includes be banned from .h files?
You've outlined the two main philosophies on this subject.
My own opinion (and I think that's all that one can really have on this) is that headers should be as self-contained as possible. I don't want to have to know all the dependencies of foo.h just to be able to use that header. I also despise having to include headers in a particular order.
However, the developer of foo.h should also take responsibility for making it as dependency-free as possible. For example, the foo.h header should be written to be free of a dependency on stdio.h if that's at all possible (using forward declarations can help with that).
Note that the C standard forbids a standard header from including another standard header, but the C++ standard doesn't. So you can see the problem you describe when moving from one C++ compiler version to another. For example, in MSVC, including <vector> used to bring in <iterator>, but that no longer occurs in MSVC 2010, so code that compiled before might not any more, because you may need to include <iterator> explicitly.
However, even though the C standard might seem to advocate the second philosophy, note that it also mandates that no header depend on another and that you can include headers in any order. So you get the best of both worlds, but at a cost of complexity to the implementers of the C library. They have to jump through some hoops to do this (particularly to support definitions that can be brought in through any of several headers, like NULL or size_t). I guess the people who drafted the C++ standard decided that adding that complexity for implementers was no longer reasonable (I don't know to what degree C++ library implementors take advantage of the 'loophole' - it looks like MS might be tightening this up, even if it's not technically required).
My general recommendations are:
A file should #include what it needs.
It should not expect something else to #include something it needs.
It should not #include something it doesn't need because something else might want it.
The real test is this: you should be able to compile a source file consisting of any single #include and get no errors or warnings beyond "There is no main()". If you pass this test, then you can expect anything else to be able to #include your file with no problems. I've written a short script called "hcheck" which I use to test this:
#!/usr/bin/env bash
# hcheck: Check header file syntax (works on source files, too...)
if [ $# -eq 0 ]; then
echo "Usage: $0 <filename>"
exit 1
fi
for f in "$#" ; do
case $f in
*.c | *.cpp | *.cc | *.h | *.hh | *.hpp )
echo "#include \"$f\"" > hcheck.cc
printf "\n\033[4mChecking $f\033[0m\n"
make -s hcheck.o
rm -f hcheck.o hcheck.cc
;;
esac
done
I'm sure there are several things that this script could do better, but it should be a good starting point.
If this is too much, and if your header files almost always have corresponding source files, then another technique is to require that the associated header be the first #include in the source file. For example:
Foo.h:
#ifndef Foo_h
#define Foo_h
/* #includes that Foo.h needs go here. */
/* Other header declarations here */
#endif
Foo.c:
#include "Foo.h"
/* other #includes that Foo.c needs go here. */
/* source code here */
This also shows the "include guards" in Foo.h that others mentioned.
By putting #include "Foo.h" first, Foo.h must #include its dependencies, otherwise you'll get a compile error.
Well, main shouldn't rely on "foo.h" in the first place for stdio. There's no harm in including something twice.
Also, perhaps foo.h doesn't really need stdio. What's more likely is that foo.c (the implementation) needs stdio.
Long story short, I think everyone should just include whatever they need and rely on include guards.
Once you get into projects with hundreds or thousands of header files, this gets untenable. Say I have a header file called "MyCoolFunction.h" that contains the prototype for MyCoolFunction(), and that function takes pointers to structs as parameters. I should be able to assume that including MyCoolFunction.h will include everything that's necessary and allow me to use that function without looking in the .h file to see what else I need to include.
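A sketch of what a self-contained MyCoolFunction.h could look like (the struct names are invented); because the function only takes pointers, forward declarations are enough and callers don't need to hunt for extra headers:
/* MyCoolFunction.h - self-contained: declares everything it needs */
#ifndef MYCOOLFUNCTION_H
#define MYCOOLFUNCTION_H

struct sensor_config;    /* forward declarations suffice for pointer parameters */
struct sensor_result;

void MyCoolFunction(const struct sensor_config *cfg, struct sensor_result *out);

#endif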
If the header file needs a specific header, add it to the header file
#ifndef HEADER_GUARD_YOUR_STYLE
#define HEADER_GUARD_YOUR_STYLE
#include <stdio.h> /* FILE */
int foo(FILE *);
#endif /* HEADER GUARD */
if the code file doesn't need a header, don't add it
/* #include <stdio.h> */ /* removed because unneeded */
#include <stddef.h> /* NULL */
#include "header.h"
int main(void) {
foo(NULL);
return 0;
}
Why don't you #include stuff in the *.c file corresponding to the header?