I can find tons of descriptions of how to generate a compilation database (especially compile_commands.json) for a C/C++ project.
But what about the other way round: how can I use one as input for a build via make or something similar from the command line?
Related
Previously, I asked for and received advice on invoking the Clang Static Analyzer for cross-translation-unit analysis, but that is now a separate issue.
What I want to ask here is: do I need to include linker commands when using the newer CodeChecker?
CodeChecker dev here. Linker information is not used during the analysis. You can read about the CTU mode in this user guide: https://github.com/Ericsson/codechecker/blob/master/docs/analyzer/user_guide.md#cross-translation-unit-ctu-analysis-mode
The workflow when using CTU mode is more or less the same as without it. The standard workflow is to do a CodeChecker log, then a CodeChecker analyze, then a CodeChecker parse or store to view the results; to enable CTU, just add the --ctu flag to the analyze command.
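For concreteness, a minimal sketch of that workflow on a make-based project could look like this (the output paths are just illustrative placeholders):

# wrap the build so CodeChecker records the compile commands
CodeChecker log -b "make" -o ./compile_commands.json
# analyze with cross-translation-unit mode enabled
CodeChecker analyze ./compile_commands.json --ctu -o ./reports
# print the results locally (or use "CodeChecker store" to upload them to a server)
CodeChecker parse ./reports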
I already have a makefile-based project in C. There are multiple makefiles, invoked from the command line, that automate the build process across the project's multiple folders.
How can the same makefiles be used in my Eclipse project for building / compiling the code?
Create a "Makefile project with existing code". Its project tree will include all the makefiles, given that they have common names.
Then use the view "Build targets" (menu "Windows" > "Show Views" > "Other") to create build targets in the respective directories. Please read its documentation to make the most of it. It is generally a good idea to read at least some of the introducing articles of the built-in help (menu "Help" > "Help Contents").
You might want to visit the project's or the general preferences for the build tool. For example, I like to unset "Stop on first build error" to get the make option -k.
I use this regularly, and it works really perfectly.
Note 1: If you deselect the standard make tool when you create a new target, you can even change the build command. On Windows you can even enter the name of a batch file, or any other tool. These build targets are really versatile.
Note 2: This Tutorial: Makefile Projects with Eclipse shows some of this in a nutshell.
I have mostly used Kate, Vim, etc. to code, and just the plain console and gdb (rarely) to test. I want to start using Eclipse, mainly for ease of looking things up, and hopefully (though it's not super important) to run the GUI debugger.
However, I don't want Eclipse to touch my real project folders (it should change the code, obviously, but I don't want it to create any configuration folders, etc.). Is that possible? I was thinking of creating a workspace in a different folder and adding the sources from my project path, but this seems complicated for someone with no Eclipse experience, especially when makefiles are involved!
Has anyone done something similar? Any guidelines?
Yes, this is fairly straightforward. Instead of creating a standard C project that creates and manages makefiles for you, use a "Makefile Project with Existing Code".
If you don't want the .cproject, .project, etc. files intermixed with your sources, create the CDT project in an empty directory and use Linked Files and Folders to pull in what you do want in the project.
If you do try to do a build within Eclipse, it will run "make all", but if you don't have a Makefile you get this (same for clean):
make all
make: *** No rule to make target `all'. Stop.
What I have done for projects that don't have a make equivalent (like a CPython extension) is to write a trivial Makefile that delegates the all and clean targets to my real tool.
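For example, a minimal delegating Makefile along those lines might look like this (the setup.py commands are just one plausible underlying tool; substitute whatever your project actually uses):

# hedged sketch: forward Eclipse's "make all" / "make clean" to the real build tool
# (recipe lines must be indented with a tab)
.PHONY: all clean
all:
	python setup.py build_ext --inplace
clean:
	python setup.py clean --all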
Once you have the project created, you will need to configure it to get all the goodness of CDT. The CDT indexer and scanner need to know about your compiler settings (really just the includes and defines). There are two ways of delivering that information:
Run a verbose build (i.e. with the gcc command lines echoed) from within Eclipse (e.g. using the trivial Makefile described above; see the example invocations after this list). CDT will parse that output and automatically pick up the compiler options used.
There are a number of ways CDT can learn what your settings are; to configure how they are picked up, head to project properties -> C/C++ General -> Preprocessor Include Paths, Macros etc. and adjust the sources in the Providers tab.
Alternatively, in the project properties, edit the C/C++ General -> Paths and Symbols settings. You may also have to do this if CDT cannot determine all your settings from step 1.
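Regarding the verbose build, what matters is that the actual compiler command lines show up in the build output instead of being silenced. A few common ways to get that, depending on the build system (illustrative, not exhaustive):

make            # plain makefiles echo recipe lines unless they start with "@"
make V=1        # kbuild-style projects such as the Linux kernel
make VERBOSE=1  # makefiles generated by CMake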
Suppose you have some source code that comes from the Unix world. It consists of a few files that build a library, plus a lot of small .c files (say 20 or so), each with its own main() function, that are compiled into command-line tools that use the library.
On Unixy systems you can use a makefile to do this easily, but the most naive transformation to the Windows / Visual Studio world involves making a separate project for each tool. Although that works, it is a lot of work to set up and keep in sync, and it is harder to navigate at both the filesystem and the project/solution level. I've thought about using different configurations in which all but one .c file is excluded from the build, but that would make building all the tools at once impossible.
Is there a nice way of building all the tools from a single "thing" (project, msbuild file, etc.)?
I'm really not interested in using Cygwin's gcc, MinGW, or NAnt. I'd like to stick with the standard Windows toolchain as much as possible.
You don't HAVE to use Visual Studio to compile code. You can write your own batch file or PowerShell script that simply calls the compiler on your sources, just like a makefile would.
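As a rough illustration, such a batch file could be as simple as the sketch below (file names and flags are made up; run it from a Developer Command Prompt so cl.exe and lib.exe are on the PATH):

rem build.bat - hedged sketch: build the shared code into a static library,
rem then compile each small tool against it
cl /nologo /c /W3 lib_a.c lib_b.c
lib /nologo /out:mylib.lib lib_a.obj lib_b.obj
for %%f in (tool*.c) do cl /nologo /W3 %%f mylib.lib /Fe%%~nf.exe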
So I've been looking into this for a while now and the solutions all leave much to be desired.
You can...
Create a lot of small projects by hand.
Use MSBuild and deal with its steep learning curve.
Use a build tool that does not integrate well with Visual Studio, like GNU make.
You can't even make a project template like you can with .NET projects! Well, you can make a wizard if you want to wade through the docs on doing that I suppose. Personally, I have decided to go with the "many small projects" solution and just deal with it. It turns out it can be less horrible than I had thought, though it still sucks. Here's what I did in Visual Studio 2008:
Create your first Win32 command line tool project, get all your settings down for all platforms and make sure it works under all circumstances. This is going to be your "template" so you don't want to edit it after you've made 20 copies.
(Optional) I set up the paths in my Visual Studio project files so that everything is built in the project directory, and then a post-build step copies just the dll/exe/pdb files I need to $(SolutionDir)$(OutDir); a sketch of such a post-build command follows this list. That way you can jump into a single directory to test all your tools and/or wrap them up for a binary distribution. VS2008 seems to be insane about where it drops output folders, with the default locations of Win32 and x64 output differing. Spending a few minutes to ensure that all platforms are consistent will pay off later.
Clean up your template. Get rid of any user settings files and compiler output.
Copy and paste your project as many times as you need. One project per tool.
Rename each copied project folder and project file to a new tool name. Open up the project file in a text editor like Notepad++. If you have a simple, 1-file project you'll need to change the project name at two places at the beginning of the file and the source code file name(s) at the end of the file. You shouldn't need to touch the configuration stuff in the middle.
You will also need to change the GUID for the project. Pop open guidgen.exe (in the SDK bin directory) and use the last radio button setting. Copy and paste a new GUID into each project file at the top. If you have dependencies, there will be one or more GUIDs at the bottom of the file near the source code. Do NOT change them as they are the GUIDs from the dependencies and have to match!
Go into Visual Studio, open up your main solution and add your tool projects.
Go into the configuration manager and make sure that everything is correct for all supported platforms, then test your build.
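For the optional post-build step mentioned above, the command is essentially a copy driven by the standard Visual Studio macros. A rough sketch only; adjust the file list and any trailing slashes to your own layout:

rem post-build event (sketch): gather the exe/dll/pdb outputs in one shared folder
xcopy /y "$(TargetDir)$(TargetName).exe" "$(SolutionDir)$(OutDir)"
xcopy /y "$(TargetDir)$(TargetName).pdb" "$(SolutionDir)$(OutDir)"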
It's not beautiful, but it works and it's very much worth the setup time to be able to control your builds from the GUI. Hopefully VS2010 will be better about this, but I'm not too hopeful. It looks like MS is giving a lot more love to the .NET community than the C/C++ community these days.
If you have a makefile you can use a "makefile project" in Visual Studio (which is misnamed: it simply allows you to specify custom build/debug commands), and use it to invoke GNU make.
You will need to change the makefile to use the VC++ command line tools instead of cc or gcc or whatever it uses, but often these are specified by macros at the top of the makefile.
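For instance, if the toolchain is parameterized at the top of the makefile, the switch can be as small as repointing those macros at the VC++ tools; a rough, purely illustrative sketch:

# hedged sketch: same makefile, toolchain macros switched to VC++
CC     = cl
CFLAGS = /nologo /W3 /O2

mytool.exe: mytool.c mylib.lib
	$(CC) $(CFLAGS) mytool.c mylib.lib /Femytool.exe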
If the makefile uses other Unix-specific commands (such as rm), you may need to make modifications, or create batch files to map those commands to Windows equivalents. Another option is to install the necessary tools from GNUWin32 to make it work.
If the build is very complex or involves configure scripts, then you have a harder task. You could generate the makefile from a configure script using MSYS/MinGW, and then modify it as above to make it work with VC++.
Makefile projects will not be as tightly integrated into Visual Studio, however. All the build management is down to you and the makefile.
If you're really using Visual Studio, I would suggest creating a project for each tool, and adding these projects to a single solution. From Visual Studio, it's easy to build a complete solution all at once, and MSBuild knows how to build .sln files as well.
msbuild myslnfile.sln
or even:
msbuild
... will build your solution.
I'm doing some Linux kernel development, and I'm trying to use NetBeans. Despite its declared support for Make-based C projects, I cannot create a fully functional NetBeans project. This is despite having NetBeans analyze a kernel binary that was compiled with full debugging information. Problems include:
files are wrongly excluded: some files are incorrectly greyed out in the project, which means NetBeans does not believe they should be included, when in fact they are compiled into the kernel. The main problem is that NetBeans will miss any definitions that exist in those files, such as data structures and functions, and will also miss macro definitions.
cannot find definitions: pretty self-explanatory - oftentimes NetBeans cannot find the definition of something. This is partly a result of the above problem.
can't find header files: self-explanatory
I'm wondering if anyone has had success with setting up Netbeans for Linux kernel development, and if so, what settings they used. Ultimately, I'm looking for Netbeans to be able to either parse the Makefile (preferred) or extract the debug information from the binary (less desirable, since this can significantly slow down compilation), and automatically determine which files are actually compiled and which macros are actually defined. Then, based on this, I would like to be able to find the definitions of any data structure, variable, function, etc. and have complete auto-completion.
Let me preface this question with some points:
I'm not interested in solutions involving Vim/Emacs. I know some people like them, but I'm not one of them.
As the title suggests, I would also be happy to know how to set up Eclipse to do what I need.
While I would prefer perfect coverage, something that only misses one in a million definitions is obviously fine
SO's useful "Related Questions" feature has informed me that the following question is related: https://stackoverflow.com/questions/149321/what-ide-would-be-good-for-linux-kernel-driver-development. Having read it, that question is more of a comparison between IDEs, whereas I'm looking for how to set up a particular IDE. Even so, the user Wade Mealing seems to have some expertise in working with Eclipse on this kind of development, so I would certainly appreciate his (and of course all of your) answers.
Cheers
Eclipse seems to be pretty popular for Linux kernel development:
http://cdtdoug.blogspot.com/2008/12/linux-kernel-debugging-with-cdt.html
http://jakob.engbloms.se/archives/338
http://revver.com/video/606464/debugging-the-linux-kernel-using-eclipsecdt-and-qemu/
I previously wrote up an answer. I have now worked out all the details of the solution and would like to share them. Unfortunately Stack Overflow does not allow me to edit the previous answer, so I'm writing it up as a new answer.
It involves a few steps.
[1] The first step is to modify the kernel build scripts to leave the dep files in place. By default, those dep files are removed after being used in the build. They contain exact dependency information about which other files each C file depends on, and we need them to create a list of all the files involved in a build. Thus, modify the files under linux-x.y.z/scripts so that they do not remove the dep files, like this (in linux-3.1.2/scripts):
Kbuild.include: change "rm -f $(depfile);" to "echo do_not_rm1 rm -f $(depfile);"
Makefile.build: change "rm -f $(depfile);" to "echo do_not_rm2 rm -f $(depfile);"
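That is, prefixing the rm with echo turns the removal into a harmless message. One (purely illustrative) way to apply the edit is with sed; double-check the exact text in your kernel version first:

# hedged sketch: neutralize the dep-file removal in the two build scripts
sed -i 's/rm -f $(depfile);/echo do_not_rm1 rm -f $(depfile);/' scripts/Kbuild.include
sed -i 's/rm -f $(depfile);/echo do_not_rm2 rm -f $(depfile);/' scripts/Makefile.build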
The other steps are detailed in my GitHub project file https://github.com/minghuascode/Nbk/blob/master/note-nbkparse. Roughly, you do the following:
[2] Configure the kernel with your usual method, but be sure to use the "O=" option so the object files are built into a separate directory.
[3] Then use the same "O=" option plus the "V=1" option to build Linux, and save the make output into a file (example commands follow this list).
[4] Run my nbkparse script from the GitHub project above. It does the following:
[4.1] Reads in the make log file and the dep files, and generates a mirroring command.
[4.2] Runs the mirroring command to hard-link the relevant source files into a separate tree, and generates a make-log file for NetBeans to use.
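For steps [2] and [3], the invocations look roughly like this (the output directory and log file name are arbitrary examples):

make O=../kbuild defconfig                          # or your own configuration method
make O=../kbuild V=1 2>&1 | tee ../kernel-build.log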
Now create a NetBeans C project using the mirrored source tree and the generated log file. NetBeans should be able to resolve all the kernel symbols, and you will only see the files involved in the build.
The Eclipse wiki has a page about this: HowTo use the CDT to navigate Linux kernel source
I have been doing some embedded Linux development, including kernel module development, and have imported the entire Linux kernel source code into Eclipse as a separate project. I have been building the kernel itself outside of Eclipse (so far), but I don't see any reason why I shouldn't be able to set up the build environment within Eclipse to build the kernel. For my projects, as long as I set up the path properties to point to the appropriate Linux source include directories, it seems to be pretty good about name completion for struct fields, etc.
I can't really comment on whether it is picking up the correct defines and not greying out the corresponding sections, as I haven't paid too much attention to the files within the kernel itself (so far).
I was also wondering about using NetBeans as a Linux C IDE, as I do prefer NetBeans for Java GUI development.
I think this would work (I have done each step for various projects):
[1] Modify the kernel build scripts to leave the .d files in place. By default they are removed.
[2] Log the build process to a file.
[3] Write a script to parse the build log.
[3.1] From the build log, you know every .c file that was compiled.
[3.2] From each .c file, you know the corresponding .d file.
[3.3] Look into the .d files to find all the included .h files.
[3.4] Form a complete list of the .c and .h files.
[4] Now create a new directory, and use "ln -s" or "ln" to pick out the files of interest (a rough shell sketch follows the explanations below).
Now, create a NetBeans project for the existing source code from [4], and configure code assistance to use the make-log file. You should see exactly the effective source code as when you built it in [2].
Some explanations of the above steps:
At [2], do a real build so the log file contains the exact files and flags of interest. Later, NetBeans will be able to use those exact flags when parsing.
At [4], pick only the files you want to see. Incorporating the whole kernel tree into NetBeans would be impractical.
There is a trick to parsing the .d files: many of the dependency entries are not real paths to .h files; they are modified entries for parts of the Linux config sections in the autoconf file. You may need to reverse the modification to figure out which real header file is meant.
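As a very rough shell sketch of steps [3] and [4] (the real parsing also has to handle the .d-file quirk just described; the patterns and paths here are only illustrative):

# list the .c files that were actually compiled, according to the build log
grep -oE '[^ ]+\.c' build.log | sort -u > compiled-c-files.txt
# mirror them with hard links (use "ln -s" instead if the mirror is on another filesystem);
# headers gathered from the .d files would be handled the same way
mkdir -p ../kernel-mirror
while read -r f; do
  mkdir -p "../kernel-mirror/$(dirname "$f")"
  ln "$f" "../kernel-mirror/$f"
done < compiled-c-files.txt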
Actually, there is a topic on the NetBeans site. This is the discussion URL: http://forums.netbeans.org/ntopic3075.html . And there is a wiki page linked from the discussion: wiki.netbeans.org/CNDLinuxKernel . Basically, it asks you to prefix make with CFLAGS="-g3 -gdwarf-2".
I found this link very helpful in setting up proper indexing in Eclipse. It requires running a script to alter the Eclipse environment to match your kernel options; in my case:
$ autoconf-to-eclipse.py ./include/generated/autoconf.h .
An illustrated guide to indexing the linux kernel in eclipse