Having a .c source file store .txt information at compile-time

I'm using C to write an RTEMS application for a given target (a LEON processor, more specifically).
While working through the various tutorials I noticed that, since it isn't possible to load the simulation .txt files directly, the solution is to have .c source files (let's call them inputs.c) hold the various 512x512 global input matrices and reference them as extern within the main file.
I've tried to find information about this procedure but haven't found any.
My question: in the documentation of the example they state that at some point they are going to transfer the global matrices in inputs.c from the PC to the target via UART. Isn't the inputs.c file loaded into the LEON processor along with all the other .c files?

I think some information is missing here to fully understand your environment...
But it could be that the data in inputs.c is linked into a separate section (you should check the RTEMS linker command file, cmdlnk).
That way it won't be loaded by grmon automatically but only with a specific command.
Or perhaps you do actually upload the data at the same time as the executable code when you issue the "load" command in grmon.
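For reference, the pattern the question describes usually looks something like the sketch below; the matrix name and section name here are made up, not taken from the actual tutorial. The GCC-specific section attribute is only needed if you want the data placed in a dedicated linker section, as suggested above:

    /* inputs.c -- matrix data generated offline from the simulation .txt file.
       Placing it in a dedicated section is optional; it only matters if the
       linker command file (cmdlnk) treats that section specially. */
    const int input_matrix[512][512] __attribute__((section(".inputdata"))) = {
        { 1, 2, 3, /* ... remaining 509 values ... */ },
        /* ... remaining 511 rows ... */
    };

    /* main.c -- refer to the data as extern; no file I/O needed at runtime */
    extern const int input_matrix[512][512];

    int main(void)
    {
        int first = input_matrix[0][0];  /* data is already in memory/ROM */
        (void)first;
        return 0;
    }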

Why use separate source files?

I'm learning C; coming from a scripting-language background, I find it highly intriguing and rather confusing.
A brief story of how I got to this question:
At first I was confused about why I can't include a source (.c) file in another source file; then I found out that doing so duplicates the function definitions. Then I found out about header files (.h) and was confused again: why do I have to declare a function in one file and define it in another, so that when something changes I have to edit two files? So I started defining functions in header files. Then I found out that #ifndef guards don't help across separate source files (each .c file is compiled as its own translation unit), so here's the question I can't yet find the answer to:
Why do I even have to use separate source files? Why can't I just have one source file and put all of my other code/function definitions in header files? That way everything would be defined once and included once in the final build.
Now don't get me wrong, I'm not thinking I'll start a revolution; I'm just looking for answers as to why this is not how it works.
If you think beyond small learning programs, there are several benefits to splitting code into multiple source files.
Code Organization
Large programming projects can have millions of lines of code. You don't want files that big! Editors will probably have trouble handling them, human beings will have trouble understanding them, and multiple developers would constantly conflict by all touching the same file. If you separate the code by purpose, it will be much easier to handle.
Build Times
Many code changes are small, while compilation time can be expensive. Compilers typically work on a file at a time, not parts of files. So if you make a tiny change and then have to rebuild the entire project, that can be very time consuming. If your code is separated into multiple source files, making a change in one of them means you only have to recompile that file.
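To make that concrete, here is a hedged sketch with made-up file names: each translation unit is compiled on its own, so after a change only one of them needs recompiling before the relink:

    # full build: compile each file separately, then link
    gcc -c main.c   -o main.o
    gcc -c parser.c -o parser.o
    gcc -c ui.c     -o ui.o
    gcc main.o parser.o ui.o -o app

    # after editing only parser.c: recompile just that file, then relink
    gcc -c parser.c -o parser.o
    gcc main.o parser.o ui.o -o app

Build tools such as make automate exactly this, recompiling only the files whose sources changed.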
Reusability
Frequently, code can be reused for more than one program. If you have all your code in one source file, you'll have to go and copy that code into another file to reuse it. Of course, now if it has a bug you have two places to fix it. Not good.
Let's say, for example, you have code that uses a linked list. If you put the linked list code into its own source file, you can then simply link that into another program. If there's a bug, you can fix it in one place, recompile, and then re-link the programs that use it.
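A minimal sketch of that split (the names are illustrative, not from any particular library):

    /* list.h -- the public interface; safe to include from many files */
    #ifndef LIST_H
    #define LIST_H

    struct node {
        int          value;
        struct node *next;
    };

    /* Prepend a value; returns the new head, or NULL on allocation failure. */
    struct node *list_push(struct node *head, int value);

    #endif /* LIST_H */

    /* list.c -- the one and only definition of the list functions */
    #include <stdlib.h>
    #include "list.h"

    struct node *list_push(struct node *head, int value)
    {
        struct node *n = malloc(sizeof *n);
        if (n == NULL)
            return NULL;
        n->value = value;
        n->next  = head;
        return n;
    }

Any program that needs a list now includes list.h and links against list.o; a bug fix in list.c means recompiling that one file and relinking.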
You can use a single source file for some (small) projects.
For many projects though, it makes sense to divide the source in different source files according to their function.
Let's say you're making a game.
Have all the user interface code in its own source file.
Have all the computer-move algorithms in their own source file.
...
Have the main() function, which ties it all together, in its own source file.
Then, to compile for PC you do gcc game.c algo.c ui-pc.c; to compile for Android you do gcc game.c algo.c ui-android.c; and to try out a brand new algorithm you thought up and aren't sure is any good, gcc game.c algo-test.c ui-pc.c.
Header files help keep everything in sync. And they're a good place for documentation.

Compile binary data into C program and use them like a file

I have a C library which uses a set of binary data files (read only). One of these files, let's call it f1.dat, is used in 99% of applications which use the library, while the other 59 files f2.dat .. f60.dat are used only rarely.
I would like to compile the data of f1.dat directly into the library. The users of the library who never wish to use the data in files f2.dat .. f60.dat would not have to carry an extra data file around, the compiled library .dll or .so would work without extra resources for those users.
The most convenient solution would be if the memory area holding the data could be accessed with the same function calls (fseek, ftell, fread) as data in a file. For the application it should make no difference whether it reads an external file or this in-memory "file".
Is there a portable solution for this?
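For what it's worth, a common (if not fully portable) approach is to embed the data as a C array, for example generated with xxd -i f1.dat, and then wrap it in a FILE * with fmemopen, which is POSIX (2008) rather than standard C. A minimal sketch under those assumptions:

    /* f1_data.c -- generated from f1.dat, e.g. with: xxd -i f1.dat */
    const unsigned char f1_dat[] = { 0x42, 0x49, 0x4e /* ... */ };
    const unsigned int  f1_dat_len = sizeof f1_dat;

    /* library code -- POSIX only, since fmemopen is not in ISO C */
    #define _POSIX_C_SOURCE 200809L
    #include <stdio.h>

    extern const unsigned char f1_dat[];
    extern const unsigned int  f1_dat_len;

    FILE *open_f1(void)
    {
        /* "rb": the embedded buffer is treated as read-only binary data */
        return fmemopen((void *)f1_dat, f1_dat_len, "rb");
    }

The returned FILE * then works with fseek, ftell, and fread exactly like a file opened with fopen.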

Difference between the ELF vs MAP file

The linker can output both the ELF and the MAP file. These files are especially relevant in the embedded systems world because the ELF file is usually used to read out the addresses of variables or functions. Additionally, the ELF file is used by different embedded measurement or analysis tools.
When I open a MAP file, then within it, I can see for each global variable and every external function the following information: allocated address, symbolic name, allocated bytes, memory unit, and memory section.
On the other hand, the ELF file is binary and not human readable when I open it. However, some tools I use are able to read and interpret it: they can obtain the symbolic name of a variable/function and its address, or even show a function prototype.
From my understanding, the ELF and MAP files basically contain the same information; it's just that the first is binary and the latter is a text file.
So what are the actual differences between these two files from the content perspective?
Thank you in advance!
The primary output of the linker (i.e. its main purpose) is to produce the fully linked executable code. That is the ELF (Executable and Linkable Format) file. An ELF file may, as you have observed, contain symbols; these are used for debugging. It may also contain metadata that associates the machine code with the source code from which it was generated. But the bulk of its content (and the part that is not optional) is the executable machine code and the data objects that make up your application.
The MAP file is an optional, information-only, human-readable output that describes the location and size of the code and data objects in your application. The MAP file also includes a summary showing the total size and memory usage of your code.
In an embedded cross-development environment, the symbol information in the ELF file is used when the code is loaded into a source-level symbolic debugger. The debugger takes the binary code/data segments in the ELF file and loads them onto the target (typically using JTAG or some other debug/programming hardware tool), and loads the symbols and source-level debug metadata into the debugger itself. Then, while the real machine code executes on the target, that execution is reflected in the debugger against the original source code, where you can view, step, and set breakpoints at the source level.
In short, the ELF file is your program. The MAP file is, as its name suggests, a map of your executable - it tells you where things are in the executable.
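To see both artifacts from a GNU toolchain: the linker can be asked for a map file at link time, and binutils can dump the symbol information stored in the ELF itself:

    # link, and ask the GNU linker to also write a map file
    gcc main.o util.o -Wl,-Map=app.map -o app.elf

    # inspect what the ELF contains
    nm app.elf            # symbol names and addresses
    readelf -s app.elf    # the full symbol table
    objdump -h app.elf    # section headers: sizes and load addresses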

Does a C program load everything to memory?

I've been practicing C today and something came to mind. Whenever C code is run, does it load all the files needed for execution into memory? Like, do the main.c file and its header files get copied into memory? What happens if you have a complete C program that takes up 1 GB or something similarly large?
A C program is first compiled into a binary executable, so header files, source files, etc. do not exist anymore at that point... unless you compiled your binary with debugging information (the -g flag).
This is a huge topic. Generally the executable is mapped into what's called virtual memory, which allows addressing more space than is available in your computer's physical memory (through paging). When you try to access code segments that are not yet loaded, a page fault occurs and the OS fetches what's missing. Compilers will often reorder functions to avoid executing code from random memory locations, so most of the time you're executing only a small part of your binary.
If you look into specific domains such as HPC or embedded devices the loading policies will likely be different.
C is a compiled language, not an interpreted one.
This means that the original *.c source file is never loaded at execution time. Instead, the compiler processes it once to produce an executable file containing machine language.
Therefore, the size of the source file doesn't directly matter. It may well be very large, covering a lot of different use cases, and still produce a tiny executable, because only the applicable code is selected at compilation time. Most of the time the executable size remains correlated with its source, but that doesn't necessarily mean it will end up huge.
Also, the *.h header files included at the top of C source files do not actually "import" a dependency (the way use, require, or import would in other languages). The #include directive merely inserts the content of a file at a given point; these files usually contain only function prototypes, variable declarations, and some preprocessor #define clauses, which form the API of an external resource that is linked to your program later.
These external resources are typically other object modules (when you have multiple *.c files within the same project and don't want to recompile them all from scratch every time), static libraries, or dynamic libraries. The latter are DLL files under Windows and *.so files under Unix. In that case, the operating system automatically loads the required libraries when you run your program.
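A hedged illustration of that pipeline with a GNU toolchain on Linux (file names made up): headers are just text pasted in by the preprocessor, each .c file becomes an object module, and shared libraries are resolved by the OS loader at run time:

    gcc -c util.c -o util.o       # compile one object module
    gcc -c main.c -o main.o       # main.c only #includes util.h (prototypes)
    gcc main.o util.o -lm -o app  # link the objects plus the math library
    ldd app                       # list the .so files loaded when app runs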

What is more efficient with C code?

Which is more efficient with the C code I am working with: moving my code into an existing C program, or having an .h file #included so it calls the separate .c file?
When this is compiled into an .exe, how does it work out, having the code incorporated into the original file versus having an .h file and a separate .c file?
I am not sure how many lines of code the program I would incorporate this other code into has, but my code is only about a thousand lines.
Thanks,
DemiSheep
There's no difference in efficiency between keeping code in a single .c file or in multiple .c files linked into the same executable, since once the executable is created, chances are it will contain the same binary code whichever method you choose. This can be easily verified with a binary disassembler, of course.
What a single .c file can change, however, is the speed of compilation, which may be faster for a single file than for a bunch of files that have to be linked together. IIRC, one of the reasons SQLite's preferred source distribution method is a single huge "amalgamated C file" is compilation speed.
That said, you should really not concern yourself with issues of this kind. Do break your programs into separate modules, each with a clean interface and a separate implementation file.
I know this already has a good answer that says there's no difference, but I wanted to add this:

The #include directive tells the preprocessor to treat the contents of a specified file as if those contents had appeared in the source program at the point where the directive appears.

I would have written it myself, but the MSDN docs say it quite nicely.
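You can actually watch the preprocessor do that textual insertion: with GCC, the -E flag stops after preprocessing and prints the expanded source, every #include pasted in place:

    gcc -E main.c    # main.c with all #included contents inserted inline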
