I'm learning how to use the Data Display Debugger (DDD) for my C/C++ programs. The Help reference for DDD shows some sample outputs, including a graphing / charting example. I'm trying to reproduce the exercise, but I'm having difficulty. The way it should work is: I compile cxxtest.c with debugging options, and the DDD tool graphs the array variable of interest during a step-debugging session, in both 2D and 3D. Wow, if it works.
The cxxtest.c program is included in the DDD repository, ddd-3.3.12.tar.gz. I'm trying to compile and run that program but I keep getting stuck. I can't figure out how to generate a config.h file, so that I can pull in the necessary support files (e.g. bool.h) to compile cxxtest.c.
Files I see in the DDD repository, relating to config include:
config-info
config.h.in
config.texi
configinfo.C
configinfo.h
configure
configure.in
None of them seem to offer much help on how to generate a config.h file.
Anybody know how to generate a config.h file?
Update: As I continue to work on this one, the whole thing seems odd. The program, cxxtest.C, has a .C suffix, but there are distinctly C++ elements in there, e.g. #include <iostream>. If I block out the config.h include, change the suffix to .cpp, and compile, I get a whole bunch of different errors. Not sure what the intent was here.
As for README content, I do see some instructions on how to compile the entire DDD tool, and they're quite lengthy. It's not clear whether preparing / configuring and compiling the DDD tool will also compile this particular test file. I guess I can wade through the makefiles and scripts and see if this file ever gets mentioned. (sigh!)
Actually I'm considering converting the entire file over to pure C via a rewrite. Note, the original file is visible here...
Note: I'm working in Virtualbox Ubuntu desktop for now... Ultimately I'd like to use the DDD tool to analyze key arrays in some digital signal processing (DSP) programs I'm working on.
Update #2:
I tried two different things here. First I built a C version of a file with the plot routines copied from the original cxxtest.c program. I converted all the calls to pure C. I could easily see the data in the DDD data window in text format. When I select the data set and then choose plot, I get a popup "DDD: Starting Plot... Starting gnuplot..." The system just hangs there.
Second, I did a complete clean install of the ddd tool. I had to install a few dependencies and correct a few known bugs (e.g. #include <cstdio>), but I was successful at both $ ./configure && make and $ make check. The make check command does correctly build and compile cxxtest.c. When I run the file and do the steps to plot the dr and ir array variables, I get the same failure as above.
System hang. A search on the failure indicates it has been reported for years, apparently without resolution. Not quite sure how to proceed. This appears to be a total fail. I cannot reproduce the DDD test to plot graphical output. Anybody else make progress on this one?
Note: with this edit, I'm also removing the "How do I generate config.h?" part from the title. That's not really the key issue here.
Anybody know how to generate a config.h file?
Yes: just run the configure script provided. A typical sequence for building open source software is:
./configure && make
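To illustrate what configure actually does with those files: it runs a series of feature tests and then substitutes the results into the config.h.in template to produce config.h. A toy sketch of just the substitution step (the macro names here are made up for the demo; a real run probes the system instead of hard-coding answers):

```shell
# Stand-in template, mimicking a fragment of a config.h.in.
cat > config.h.in <<'EOF'
#undef HAVE_BOOL
#undef PACKAGE_NAME
EOF
# A real ./configure run performs compile tests; here the results are
# hard-coded just to show the transformation.
sed -e 's/#undef HAVE_BOOL/#define HAVE_BOOL 1/' \
    -e 's/#undef PACKAGE_NAME/#define PACKAGE_NAME "ddd"/' \
    config.h.in > config.h
cat config.h
```

So from an unpacked ddd-3.3.12 tree, running ./configure in the top-level directory is what produces the config.h that the build of cxxtest.c expects.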
Related
If I am to compile my code in the terminal, I would do:
cc -std=c99 -Wall -Werror ....
If I'm to run my code in the terminal I would do:
./testprogram text.txt 1000 1000
The numbers and the text file are significant to the program.
Hopefully that gives some insight into the version of C I'm using and how to execute my program. Now: how do you debug this with Visual Studio Code? I have installed the C/C++ extension.
Every time I try to start debugging, it asks me to choose between two environments:
C++ (GDB/LLDB)
C++ (Windows)
My first problem is that neither of those two options is plain C, but maybe I just don't know better and there isn't much difference. Nevertheless, I gamble on one of the environments, usually the first one, and then I'm asked to pick between:
two "gcc-9 build and debug active file" entries,
one "gcc build and debug active file",
and a default configuration.
I usually pick the default config one, and that leads me to the launch.json page. I put in my program directory, "${workspaceFolder}/testprogram.c", and get an error:
Unable to start debugging. Launch options string provided by the project system is invalid. Unable to determine path to debugger. Please specify the "MIDebuggerPath" option.
Nothing I've tried so far and no amount of googling has helped, so hopefully someone with a lot of experience will be able to help out here. Thanks in advance.
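For what it's worth, here is a minimal launch.json sketch for the "C++ (GDB/LLDB)" / cppdbg environment on Linux; the program name, arguments, and gdb path are assumptions taken from the question above and may need adjusting:

```json
{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Debug testprogram (gdb)",
            "type": "cppdbg",
            "request": "launch",
            "program": "${workspaceFolder}/testprogram",
            "args": ["text.txt", "1000", "1000"],
            "cwd": "${workspaceFolder}",
            "MIMode": "gdb",
            "miDebuggerPath": "/usr/bin/gdb"
        }
    ]
}
```

Note that "program" should point at the compiled executable (built first with cc), not the .c source file, and the "Unable to determine path to debugger" error usually means gdb is not installed or its location has to be supplied via "miDebuggerPath".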
I've got a makefile project for Linux and I want to compile it on Win10 with Visual Studio 2019.
There are some paths to files defined in the makefile as preprocessor defines.
I have to replace the paths with files I've created myself, because they are a kind of PLATFORM_HEADER and I have to adapt a new one for Windows. In the code it looks like:
#include PLATFORM_HEADER
The only thing I've tried is adding a property sheet to my project and adding a macro on the property sheet's macro page. But this macro is not found in the project.
Changing the code is not possible, because it is third-party code and it must remain possible to update it later without making these changes again.
Other instructions note that a line called 'inherited property sheets' has to be modified in the project, but in VS2019 this line does not exist.
Thank you for your help!
This sort of problem is handled by your compiler suite. Most likely you'll deal with it via your build process manager (make, bitbake, cmake...). You can ask it to pass these #defines as arguments to the compilation (-D name=definition).
Now you've unlocked the "very most of fun": compiling something meant for Linux on Windows, where there are many ways things can go wrong. You might want to do a full check of your environment variables when compiling and make sure they point to the right system libraries.
It's probably worth giving the Windows Subsystem for Linux and other bindings / emulators a try. If you want to preview the outcome of a week of work, maybe you can do it in a Linux VM? Or just get rid of Windows once and for all :)
I am very new to coding, currently taking Harvard's CS50x class online. The extent of my familiarity with code, languages, and environments is what they've taught me so far in C. With vague guidance from other questions on this site, I've taken about 4 days to install gperf, from discovering what Cygwin is to installing all of its libraries and error-checking the installation all the way up to finding out where it finally put the installed program. I was so happy when I actually found the application 'gperf.exe' just now. I thought I was finally just about to get the hash function I've been trying to make for almost a week.
And now, the program does nothing but hang every time I try to run it, with no error messages. Offering no input file causes it to hang. With any number and selection of options specified in the manual, it just hangs. Even debugging says it's entering debugging, then just hangs. The only way I've been able to get the program to respond at all is offering it an invalid input file, which it says is invalid. Nothing else does anything; no output file, no command-line response, nothing. I am frustrated to the point of tears, and the documentation provided with gperf assumes you're a professional coder, talking endlessly about the hundreds of high-level customization options, but not saying a word about how to make it just run on a basic level. I've searched Google and this site extensively, and very little pops up when I search for gperf-related issues specifically.
Can anyone please just walk me through how to make this program work? I'm sure it's some stupid little thing that I'm missing, but all I want it to do is take my input file of strings and give me a hash function in C. Any and all help is appreciated, I have absolutely no clue what I'm doing and even installing gperf was a multi-day process that is far beyond the scope of what I've done so far.
Thank you.
EDIT:
Executions I've tried passing:
gperf
./gperf
Arguments I've tried passing:
-a
-c
-d
--output-file 'FILE'
I have tried all of these with and without the inclusion of my input file, named 'keys' and 'keys.txt'. The only thing that has generated any response from the program has been giving it an incorrect input-file name, giving the result 'could not load input file 'keys''.
gperf can be slow if the input file you give it is large, that is, contains many keys. You can get an impression of what counts as a "small" vs. a "large" file by looking at the documentation: Known bugs and limitations.
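For reference, a gperf input file is just an (optional) declarations section, a %% separator, and then one keyword per line. A minimal example (the keyword list and function name are invented here):

```
%define lookup-function-name lookup_keyword
%%
break
case
continue
default
```

With a file like that, gperf keys.gperf > hash.c writes the generated hash function to hash.c. Also note that when gperf is started with no input file at all, it reads keywords from standard input, so it will sit there silently waiting for input, which looks exactly like a hang.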
I'm currently trying to add changes in how the SQLite virtual machine executes its code. To do that, I edit the vdbe.c file from the SQLite source.
The issue is, compiling SQLite consists of generating two huge implementation and header files (sqlite3.c and sqlite3.h) by amalgamating several smaller ones, after parsing some of them to generate code and documentation.
Unfortunately, the amalgamation process takes a relatively long time (about 15 seconds). I was wondering if there would be a somewhat easy way to not compile everything every time like it currently is the case, and possibly save a lot of compile time.
The main difficulty stems from the fact that the source files are not valid by themselves (they only compile once they have been amalgamated, so that some types have already been defined earlier in the amalgamated file). After several attempts with a simple hand-written Python script (which would simply extract the virtual machine execution code from the amalgamation and keep the rest together), I came to the conclusion that there are too many edge cases to do it this way. I don't really know how to proceed.
Any suggestions are welcome.
I'd say: check the amalgamation into your source code repository, treat it as your own artifact, and work on it. Whenever you wish to update the amalgamation, use git to help you.
Create two branches sqlite-upstream and sqlite-local.
Check in the upstream amalgamation "v1" to sqlite-upstream, then merge that into sqlite-local and make whatever local changes you need in that branch.
When upstream releases "v2", commit that to sqlite-upstream.
Merge or rebase: you may have some conflicts to resolve, but those will be much easier to deal with than manual change tracking. Either:
Merge sqlite-upstream into sqlite-local, or
Duplicate sqlite-local into sqlite-local-v2, then rebase it onto sqlite-upstream, and use sqlite from that branch in dependent code.
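A concrete sketch of that two-branch flow (the repository name and file contents are stand-ins for the real amalgamation):

```shell
# Stand-in repo for the amalgamation artifact.
mkdir sqlite-amalgamation
git -C sqlite-amalgamation init -q
git -C sqlite-amalgamation config user.email demo@example.com
git -C sqlite-amalgamation config user.name demo
# Upstream v1 goes on its own branch.
git -C sqlite-amalgamation checkout -q -b sqlite-upstream
printf '/* SQLite amalgamation v1 */\n\nint sqlite3_demo(void) { return 1; }\n' > sqlite-amalgamation/sqlite3.c
git -C sqlite-amalgamation add sqlite3.c
git -C sqlite-amalgamation commit -qm 'upstream v1'
# Local branch carries the vdbe.c-style edits.
git -C sqlite-amalgamation checkout -q -b sqlite-local
printf '\n/* local vdbe tweak */\n' >> sqlite-amalgamation/sqlite3.c
git -C sqlite-amalgamation commit -qam 'local changes'
# Upstream releases v2: commit it on sqlite-upstream...
git -C sqlite-amalgamation checkout -q sqlite-upstream
sed -i 's/amalgamation v1/amalgamation v2/' sqlite-amalgamation/sqlite3.c
git -C sqlite-amalgamation commit -qam 'upstream v2'
# ...then bring it into the local branch (resolve conflicts here if any).
git -C sqlite-amalgamation checkout -q sqlite-local
git -C sqlite-amalgamation merge -q --no-edit sqlite-upstream
```

After the merge, sqlite-local contains both the upstream v2 content and the local edits, and git has done the change tracking.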
So in case anyone needs the answer in the future, here is how I did it. You can find the whole discussion on the SQLite Forums.
After getting the source:
I have a first make pass where I don't change any file (to compile lemon, possibly among others):
make -f Makefile.linux-gcc
Then, I edit Makefile.linux-gcc and replace all occurrences of gcc with gcc -x none
Then I change main.mk, adding -lstdc++ at the end of the executable target:
sqlite3$(EXE): shell.c libsqlite3.a sqlite3.h
	$(TCCX) $(READLINE_FLAGS) -o sqlite3$(EXE) $(SHELL_OPT) \
		shell.c libsqlite3.a $(LIBREADLINE) $(TLIBS) $(THREADLIB) -lstdc++
I run make again using make -f Makefile.linux-gcc and get my sqlite3 executable as expected.
I'm doing some Linux kernel development, and I'm trying to use Netbeans. Despite its declared support for Make-based C projects, I cannot create a fully functional Netbeans project. This is despite having Netbeans analyze a kernel binary that was compiled with full debugging information. Problems include:
files are wrongly excluded: Some files are incorrectly greyed out in the project, which means Netbeans does not believe they should be included, when in fact they are compiled into the kernel. The main problem is that Netbeans will miss any definitions that exist in these files, such as data structures and functions, and it also misses macro definitions.
cannot find definitions: Pretty self-explanatory: often, Netbeans cannot find the definition of something. This is partly a result of the above problem.
cannot find header files: self-explanatory
I'm wondering if anyone has had success with setting up Netbeans for Linux kernel development, and if so, what settings they used. Ultimately, I'm looking for Netbeans to be able to either parse the Makefile (preferred) or extract the debug information from the binary (less desirable, since this can significantly slow down compilation), and automatically determine which files are actually compiled and which macros are actually defined. Then, based on this, I would like to be able to find the definitions of any data structure, variable, function, etc. and have complete auto-completion.
Let me preface this question with some points:
I'm not interested in solutions involving Vim/Emacs. I know some people like them, but I'm not one of them.
As the title suggest, I would be also happy to know how to set-up Eclipse to do what I need
While I would prefer perfect coverage, something that only misses one in a million definitions is obviously fine
SO's useful "Related Questions" feature has informed me that the following question is related: https://stackoverflow.com/questions/149321/what-ide-would-be-good-for-linux-kernel-driver-development. Upon reading it, that question is more of a comparison between IDEs, whereas I'm looking for how to set up a particular IDE. Even so, the user Wade Mealing seems to have some expertise in working with Eclipse on this kind of development, so I would certainly appreciate his (and of course all of your) answers.
Cheers
Eclipse seems to be pretty popular for Linux kernel development:
http://cdtdoug.blogspot.com/2008/12/linux-kernel-debugging-with-cdt.html
http://jakob.engbloms.se/archives/338
http://revver.com/video/606464/debugging-the-linux-kernel-using-eclipsecdt-and-qemu/
I previously wrote up an answer. Now I've worked out all the details of the solution and would like to share them. Unfortunately Stack Overflow does not allow me to edit the previous answer, so I'm writing it up in this new answer.
It involves a few steps.
[1] The first step is to modify the Linux build scripts to leave the dep files in place. By default those dep files are removed after being used in the build. They contain exact dependency information about which other files a C file depends on, and we need them to create a list of all the files involved in a build. Thus, modify the files under linux-x.y.z/scripts so that they do not remove the dep files, like this:
linux-3.1.2/scripts
Kbuild.include: echo do_not_rm1 rm -f $(depfile);
Makefile.build: echo do_not_rm2 rm -f $(depfile);
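A sketch of that edit, run here against dummy copies of the two scripts (in a real tree they live under linux-3.1.2/scripts); the idea is just to prefix the rm with echo so the cleanup command is printed instead of executed:

```shell
# Dummy stand-ins for the two kernel build scripts.
mkdir -p kbuild-demo/scripts
printf 'cmd_and_fixdep = ...; rm -f $(depfile);\n' > kbuild-demo/scripts/Kbuild.include
printf 'cmd_and_savecmd = ...; rm -f $(depfile);\n' > kbuild-demo/scripts/Makefile.build
# Neutralize the dep-file cleanup so the .d files survive the build.
sed -i 's/rm -f \$(depfile);/echo do_not_rm1 rm -f $(depfile);/' kbuild-demo/scripts/Kbuild.include
sed -i 's/rm -f \$(depfile);/echo do_not_rm2 rm -f $(depfile);/' kbuild-demo/scripts/Makefile.build
grep depfile kbuild-demo/scripts/Kbuild.include
```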
The other steps are detailed in my github project file https://github.com/minghuascode/Nbk/blob/master/note-nbkparse. Roughly, you do the following:
[2] Configure with your method of configuration, but be sure to use the "O=" option to build the obj files into a separate directory.
[3] Then use the same "O=" option and "V=1" option to build linux, and save make output into a file.
[4] Run my nbkparse script from the above github project. It does:
[4.1] Read in the make log file, and the dep files. Generate a mirroring command.
[4.2] Run the mirroring command to hard-link the relevant source files into a separate tree, and generate a make-log file for NetBeans to use.
Now create a NetBeans C project using the mirrored source tree and the generated log file. NetBeans should be able to resolve all the kernel symbols. And you will only see the files involved in the build.
The Eclipse wiki has a page about this: HowTo use the CDT to navigate Linux kernel source
I have been doing some embedded Linux development, including kernel module development, and have imported the entire Linux kernel source code into Eclipse as a separate project. I have been building the kernel itself outside of Eclipse (so far), but I don't see any reason why I shouldn't be able to set up the build environment within Eclipse to build the kernel. For my projects, as long as I set up the PATH properties to point to the appropriate Linux source include directories, it seems to be pretty good about name completion for struct fields, etc.
I can't really comment on whether it is picking up the correct defines and not greying out the corresponding sections, as I haven't really paid too much attention to the files within the kernel itself (so far).
I was also wondering about using Netbeans as a linux 'C' IDE, as I do prefer Netbean's for Java GUI development.
I think this would work (done each step for various projects):
[1] Modify kernel build scripts to leave .d files. By default they are removed.
[2] Log the build process to a file.
[3] Write a script to parse the build log.
[3.1] From the build log, you know every .c files.
[3.2] From the .c file, you know which is the corresponding .d file.
[3.3] Look into .d files to find out all the included .h files.
[3.4] Form a complete .c and .h file list.
[4] Now create a new dir, and use "ln -s" or "ln" to pick files of interest.
Now, create a Netbeans project for existing source code, using the directory from [4]. Configure code assistance to use the make-log file. You should see exactly the effective source code as when you built it at [2].
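A sketch of step [3.3], run against a dummy .d file (real ones appear next to the .o files once step [1] is in place):

```shell
# Dummy .d file mimicking what the kernel build leaves behind.
cat > demo.d <<'EOF'
kernel/sched.o: kernel/sched.c include/linux/sched.h \
 include/linux/kernel.h
EOF
# Drop line continuations and the target, leaving one path per line.
tr -d '\\' < demo.d | sed 's/^[^:]*: *//' | tr ' ' '\n' | grep -v '^$' > demo-files.txt
cat demo-files.txt
```

The resulting list (here kernel/sched.c plus the two headers) is what gets accumulated into the complete .c and .h file list of step [3.4].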
Some explanations to the above steps:
At [2], do a real build so that the log file contains the exact files and flags of interest.
Later, netbeans will be able to use those exact flags when parsing.
At [4], pick only the files you want to see. Incorporating the whole kernel tree into netbeans would be impractical.
There is a trick to parsing the .d files: many of the dependency items are not real paths to a .h file; they are modified entries for parts of the Linux config options in the auto-config file. You may need to reverse the modification to figure out which is the real header file.
Actually there is a topic on netbeans site. This is the discussion url: http://forums.netbeans.org/ntopic3075.html . And there is a wiki page linked from the discussion: wiki.netbeans.org/CNDLinuxKernel . Basically it asks you to prefix make with CFLAGS="-g3 -gdwarf-2" .
I found this link very helpful in setting up proper indexing in Eclipse. It requires running a script to alter Eclipse environment to match your kernel options, in my case
$ autoconf-to-eclipse.py ./include/generated/autoconf.h .
An illustrated guide to indexing the linux kernel in eclipse