Testing Frameworks for C [closed]

After doing some work with Ruby, Rails, and RSpec last summer, I learned to TATFT (test all the f***ing time). Now I can't write code without writing tests first.
I'm taking a programming course in C next year, and I would like to learn to write C test-driven. Is it a good idea (or even possible) to do TDD with C? If so, are there any good testing frameworks compatible with C?

Is it a good idea (or even possible) to do TDD with C?
Yes, it is obviously a good idea, just as with other languages. However, due to the procedural nature of the language, it comes with a few extra difficulties.
Static functions quickly get in the way. This can be solved by including the source file under test, or by defining a STATIC macro that expands to static only when compiling the code for production, not for the unit tests:
#if defined(UNIT_TEST)
#define STATIC
#else
#define STATIC static
#endif
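For instance, a hypothetical internal helper in the file under test could then be written like this (clamp and foo.c are made-up names):

/* in foo.c: hidden in production builds, but visible to the unit
   tests when compiled with -DUNIT_TEST */
STATIC int clamp(int v, int lo, int hi)
{
    return v < lo ? lo : (v > hi ? hi : v);
}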
There is no isolation: there is only one global context. With an OO language you can just instantiate one object (or a cluster of collaborating objects) to test it (or them), and you can also use mock objects.
In C, however, you can override functions simply by re-defining them in your unit tests. This works fine on Unix-like systems, where the linker resolves a call to the first definition it finds; I'm not sure about Windows.
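Here is a rough sketch of that linker trick; all the names are hypothetical, and it assumes the real definition lives in a separate library (or object file) that you simply leave out of the test link:

/* production code (sensor.c) calls a hardware access function: */
int read_adc(int channel);  /* real definition lives elsewhere */

int sensor_average(int channel, int n)
{
    int sum = 0;
    for (int i = 0; i < n; i++)
        sum += read_adc(channel);
    return sum / n;
}

/* test file: provide a fake read_adc; since the real one is not
   linked into the test binary, calls resolve to this stub */
int read_adc(int channel)
{
    (void)channel;
    return 42;  /* canned value, so sensor_average(ch, n) == 42 */
}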
If so, are there any good testing
frameworks compatible with C?
You can start small with minunit. The learning curve is flat, as it is only four macros long.
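A test file using minunit looks roughly like this (add() stands in for whatever function you are testing):

#include <stdio.h>
#include "minunit.h"

int tests_run = 0;  /* required by minunit's mu_run_test macro */

static int add(int a, int b) { return a + b; }  /* function under test */

static char *test_addition(void)
{
    mu_assert("error: add(1, 2) != 3", add(1, 2) == 3);
    return 0;
}

static char *all_tests(void)
{
    mu_run_test(test_addition);
    return 0;
}

int main(void)
{
    char *result = all_tests();
    printf("%s\n", result ? result : "ALL TESTS PASSED");
    printf("Tests run: %d\n", tests_run);
    return result != 0;
}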
EDIT: There are two lists of unit test frameworks for the C language that were mentioned in other answers and that I won't repeat: one on Wikipedia and another on xprogramming.com.

We use "check" from http://check.sourceforge.net/, it provides basic functionality for testsuites and tests (similiar to junit), but is fairly lightweight. On feature I like is that it handles if your test dumps code and considers that a failure.
Also note "check" is a "C" based framework rather than a "C++" one.

I just discovered CSpec, which does BDD in C. Doesn't look very mature, but it reminds me of RSpec.

There are a number of unit testing harnesses for C. Wikipedia has a much better list than I could assemble here.

If you are actually using a C++ compiler, but using it in C mode by compiling .c files, then any of the C++ unit test frameworks will also work fine.
Take a look at the original list of xUnit frameworks at http://www.xprogramming.com/software.htm

This similar question, "Unit Testing C Code", also has a lot of answers.
I used RCUNIT; it is mature and has everything I need. I have also used IPL Cantata, which is great but very expensive, so that is probably not what you want.

You certainly can do unit testing in C (I do). The framework I use (for the Windows platform) is CunitWin32.

Here is a list of unit test frameworks for C:
http://en.wikipedia.org/wiki/List_of_unit_testing_frameworks#C
Enjoy!

A proper C programmer will tell you that because C is statically typed, it catches all the bugs you might have, and that therefore you don't need a unit test framework.
They are full of shit, but that's the argument for statically typed languages like C.
I think you should probably take the approach that Adobe took with Lightroom: write a series of core libraries in C, and then put all the glue and real logic of the application in a higher-level language. Lightroom is largely written in Lua, but many languages work for this.

Related

How much Lisp to implement in C before writing extensions in itself?

I am implementing a Lisp interpreter in C, and have implemented a few primitives like cons, car, cdr, eq, and basic arithmetic.
Just before starting to implement define and lambda, it occurred to me that I need to implement an environment. I am unsure whether I could implement it in Lisp itself.
My intent is to implement a minimal amount of Lisp so that I can write extensions to the language in itself. I am not sure how much is minimal. Would implementing an FFI qualify as minimal?
The answer to your question depends on the meaning you give to the word "minimal".
Given your question, and assuming that you don't want to compete with today's fine implementations of Common Lisp and Scheme, my hypothesis is that by "minimal" you mean: Turing-complete, that is, capable of expressing any computation expressible in a general-purpose programming language.
With this assumption, you need to implement three other things:
conditional forms (cond)
lambda expressions (lambda)
a way of defining recursive lambda expressions (labels or defun)
Your interpreter should then be able to evaluate forms. This should be sufficient to get a language equivalent to the original LISP, one in which any computable function can be expressed.
First off, you are talking about writing a Lisp interpreter. You have a lot of choices to make when it comes to scoping and Lisp-1 vs. Lisp-2, since these questions alter the implementation core. An interpreter is a general-purpose program that reads and evaluates code. It can support abstractions, but it won't extend itself with more native features.
If you are interested in such things, you can perhaps make a compiler instead. E.g. there are many Scheme-like subsets that compile to C or Java code, but you can make your own VM. Thus it can indeed compile itself to run on its own target machine (self-hosting), provided all the forms and procedures you use have been implemented using the primitives supported by the compiler.
Making a dumb compiler is not much different from making an interpreter. That is very clear if you've watched the SICP videos (10A is about compilation, 7A and 7B are about interpreters).
The environment can be a chain of pairs, just as in a Lisp interpreter. It would be difficult to implement the environment in Lisp itself without making the language very difficult to use (unless it's compiled, that is).
You may, however, use the data structures of Lisp and the primitives from the C code.
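To make the "chain of pairs" idea concrete, here is a minimal sketch in C of looking up a symbol in an environment represented as a list of (symbol . value) pairs; the Obj layout is invented for illustration:

#include <string.h>

typedef struct Obj Obj;
struct Obj {
    enum { SYM, NUM, PAIR, NIL } tag;
    union {
        const char *sym;
        double num;
        struct { Obj *car, *cdr; } pair;
    } u;
};

/* env is ((sym1 . val1) (sym2 . val2) ...): walk the spine and
   compare each binding's car against the name we want */
Obj *env_lookup(Obj *env, const char *name)
{
    for (; env && env->tag == PAIR; env = env->u.pair.cdr) {
        Obj *binding = env->u.pair.car;  /* one (sym . val) pair */
        if (binding && binding->tag == PAIR &&
            binding->u.pair.car->tag == SYM &&
            strcmp(binding->u.pair.car->u.sym, name) == 0)
            return binding->u.pair.cdr;  /* the bound value */
    }
    return NULL;  /* unbound symbol */
}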
Making an FFI is a fast way to give your language lots of features. It solves the chicken-and-egg problem by using other people's work from within your language. It fuses the top (primitives and syntax) and the bottom layer (a runtime) of your system. It's the ultimate primitive, and you can think of it as a system call or message bus to the runtime.
I strongly suggest reading Queinnec's book Lisp In Small Pieces. It is a book dedicated entirely to answering your question, and it explains in detail the many trade-offs and internals of Lisp implementations and definitions, giving many explained examples of Lisp interpreters and compilers.
You might also consider using libffi. You could be interested in the internals of M. Serrano's Bigloo & Hop implementations. You might even look inside my MELT lisp-like language for customizing the GCC compiler.
You also need to learn more about garbage collection (you might read the GC handbook). You could use Boehm's conservative garbage collector (or something else, e.g. my Qish, or MPS) or write your own GC.
You may want to learn more about Chicken, Scheme 48, and Guile, read their papers, and look inside their code.
See also J. Pitrat's blog: it is not about Lisp (it is about bootstrapping strong AI), but it has several fascinating entries related to bootstrapping.

Advantages of function prototyping [closed]

After half an hour of research on the Internet, I couldn't find any reasoned discussion of the advantages of function prototyping.
I get by in Java/Android, and am beginning a C course. Prototyping looks cumbersome compared to my previous experience, and I would like to know the reason(s) why it still exists in 2013.
I understand that life was more difficult for Ritchie and pals; however, a compiler could be written today that would generate a list of functions in a first pass, then do its usual thing using that list of functions as a current compiler would use a header file.
It probably doesn't persist solely because of backwards compatibility, either: it would be feasible to create a compiler that switches between the current mode of operation and the hypothetical new mode I just described, depending on the code it is given.
If prototyping persists, it must therefore be of interest to the programmer, not to the compiler writer. Am I right or wrong, and where can I find a reasoned discussion of the advantages of function prototyping vs. no prototyping?
You're forgetting that in C you can call a function whose source you don't have.
C supports binary distribution of code, which is quite common for (commercial) libraries.
You get a header that declares the API (all functions and data types), and the code in a .lib file (or whatever your platform uses). This is typically the case for C's entire standard library; you don't always get the source of the compiler vendor's library, but you must still be able to call the functions, of course.
For that to work, the C compiler must have the declarations when processing your code, so it can generate the proper arguments for the call, and of course deal with any return value correctly.
It's not enough to just rely on your source, since if you do
GRAPHICSAPI_SetColorRGB(1, 1, 1);
but the actual declaration is:
void GRAPHICSAPI_SetColorRGB(double red, double green, double blue);
the compiler cannot magically convert your int arguments to double if it doesn't have the prototype. Of course, having the prototype makes it possible to error-check that the call makes sense, which is very valuable.
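In other words, the header gives the compiler just enough information to generate a correct call. A sketch, reusing the hypothetical names from the example above:

/* graphics_api.h -- shipped with the binary-only library */
void GRAPHICSAPI_SetColorRGB(double red, double green, double blue);

/* your code */
#include "graphics_api.h"

int main(void)
{
    /* with the prototype visible, the int literals are converted
       to double before the call; without it, the ints would be
       passed as-is and the library would read garbage */
    GRAPHICSAPI_SetColorRGB(1, 1, 1);
    return 0;
}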
It is an interesting idea to have the compiler make a first pass over all source files to take note of all function prototypes.
However:
libraries (object code) need to have their declarations somewhere; this is why headers exist
also, I find it convenient to be able to grep the headers as free text, like
grep alloc /usr/include/*

Pascal to C converter [closed]

I'm writing a program which translates Pascal to C, and I need some help. I started with the scanner generator Flex. I defined some rules and created a scanner which works more or less OK. It breaks Pascal syntax into tokens; for now it only prints what it found. But I have no idea what I should do next. Are there any articles or books covering this subject? What is the next step?
Why do you want to write such a Pascal to C converter?
If you just want to run some Pascal programs, it is simpler to use (or improve) existing compilers, like gpc, or existing Pascal to C translators, like p2c.
If you want to convert hand-written Pascal code to human-readable (and improvable) C code, the task is much more difficult; in particular, you probably want to preserve the indentation and the comments, keep the same names as much as possible (while avoiding clashes with system names), etc.
You always want to parse into some abstract syntax tree, but the precise nature of those trees differs. Perhaps flex + bison or even ANTLR may or may not be adequate (you can always write a hand-written parser). Also, error recovery may or may not be important to you (aborting on the first syntax error is very easy; trying to make sense of an ill-written, syntactically incorrect Pascal source is quite hard).
If you want to build a toy Pascal compiler, consider using LLVM (or perhaps even the GCC middle-end and back-ends).
You might want to take a look at "Translating Between Programming Languages Using A Canonical Representation And Attribute Grammar Inversion" and references therein.
The most common approach would be to build a parse tree in your front end, and then walk through that tree outputting the equivalent C in the back end. This gives you the flexibility to perform any reordering of declarations that's required (IIRC Pascal supports use before declaration, but C doesn't). If you're using flex for the scanner, tradition would dictate using bison for the parser, although there are alternatives. If you look, you can probably find a freely available Pascal grammar in the format expected by bison.
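As a sketch of what "walking the tree and outputting the equivalent C" can look like (the node layout is invented for illustration, not taken from any real tool):

#include <stdio.h>

typedef enum { N_NUM, N_VAR, N_ADD, N_ASSIGN } NodeKind;

typedef struct Node {
    NodeKind kind;
    int num;                 /* N_NUM */
    const char *name;        /* N_VAR */
    struct Node *lhs, *rhs;  /* N_ADD, N_ASSIGN */
} Node;

static void emit_c(const Node *n, FILE *out)
{
    switch (n->kind) {
    case N_NUM:
        fprintf(out, "%d", n->num);
        break;
    case N_VAR:
        fputs(n->name, out);
        break;
    case N_ADD:  /* Pascal "a + b" becomes C "(a + b)" */
        fputc('(', out);
        emit_c(n->lhs, out);
        fputs(" + ", out);
        emit_c(n->rhs, out);
        fputc(')', out);
        break;
    case N_ASSIGN:  /* Pascal "x := e" becomes C "x = e;" */
        emit_c(n->lhs, out);
        fputs(" = ", out);
        emit_c(n->rhs, out);
        fputs(";\n", out);
        break;
    }
}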
You have to know the Pascal grammar and the C grammar, and build (design) a "something" (i.e. a grammar or an automaton...) that can translate every Pascal rule into the corresponding C rule.
Then, once you have your tokenized stream, using some method like LR, you can find the parse tree corresponding to the sequence of Pascal rules applied, and convert every rule into the corresponding C rule (this can be easily done with Bison).
Note that neither Pascal nor C has a fully context-free grammar, so more checking will be necessary.

Does the C language support inheritance? [closed]

Suppose we have a header file "add.h" with an "add(int,int)" function and a header file "sub.h" with a "sub(int,int)" function, and suppose we have a header file "calc.h" as follows:
--------------add.h-------------
int add(int a,int b)
{
return (a+b);
}
-------------sub.h--------------
int sub(int a,int b)
{
return (a-b);
}
-------------calc.h-------------
#include "add.h"
#include "sub.h"
------------program.c-----------
#include "calc.h"
#include <stdio.h>

int main(void)
{
printf("%d", add(1, 2));
printf("%d", sub(3, 1));
return 0;
}
Can't we say that this is a form of inheritance, where calc.h is inheriting from add.h and sub.h, and program.c is inheriting from calc.h?
I know this may be a silly doubt to ask, but I want to clarify it.
Further, please tell me why one should prefer object-oriented programming over procedural programming.
No, this is not inheritance. This is just simple inclusion. You can imitate inheritance in C with nested structs; a prominent example of this is the GTK framework. But C itself does not support inheritance.
Nobody says you should prefer OO. It's simply your choice. Many find it easier, especially in teams of more than two people, to maintain OO code. But what you use is your choice.
Regarding the OO or procedural question:
OO has always been surrounded by a big hype. There are a lot of things associated with OO, but if you remove the fluff and hype features, OO boils down to three major cornerstones:
modular programming with autonomous modules that lack coupling to parts of the program they don't need to know about (very important)
private encapsulation, which prevents accidental or intentional access to variables that shouldn't be altered from outside the module, to prevent bugs and spaghetti code (important)
inheritance with polymorphism, which can make code easier to maintain and expand in the future (somewhat useful)
Modular programming isn't related to the language at all; it is simply part of good program design. Private encapsulation is supported by C through the static keyword and through incomplete types (a sketch follows below), although it is crude compared to OO languages with a private keyword. Inheritance isn't supported by C, though, as mentioned by others, you can achieve it with various struct tricks.
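For example, private encapsulation via an incomplete type (the "opaque pointer" idiom) might look like this; the Counter module is invented for illustration:

/* counter.h -- users see only an incomplete type, never the layout */
typedef struct Counter Counter;
Counter *counter_new(void);
void counter_inc(Counter *c);
int counter_value(const Counter *c);

/* counter.c -- the struct layout is private to this file */
#include <stdlib.h>
struct Counter { int value; };
Counter *counter_new(void) { return calloc(1, sizeof(Counter)); }
void counter_inc(Counter *c) { c->value++; }
int counter_value(const Counter *c) { return c->value; }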
Then of course OO languages also come with a lot of other things such as constructors/destructors, interfaces, generic objects, operator overloading etc etc. You don't really need these things to write programs, though some of them make programming easier.
My answer to OO vs procedural is: You should use modular program design and private encapsulation. Both happen to be regarded as good OO practice, but nothing is stopping you from using them in a procedural program and nothing is stopping you from using them in C.
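To illustrate the struct tricks mentioned above, here is a common sketch of single inheritance in C: embed the "base" struct as the first member (all names are made up):

#include <stdio.h>

typedef struct {
    const char *name;
} Animal;

typedef struct {
    Animal base;  /* must be the first member, so a Dog * can be
                     safely treated as an Animal * */
    int barks;
} Dog;

static void animal_print(Animal *a)
{
    printf("%s\n", a->name);
}

int main(void)
{
    Dog d = { { "Rex" }, 1 };
    animal_print((Animal *)&d);  /* "upcast" works because base is first */
    return 0;
}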
No, this is not inheritance. Inheritance would mean that calc.h could modify the implementation of add() and sub(), which it can't.
Even if you decided that, say, sub.h should contain a new implementation of add(), you'd run into a problem, because the linker wouldn't know which version of the function to pick.

Which language is useful to create a report for a valid C program [closed]

Can anyone suggest a programming language which could be used to create a tool that will analyse a given C program and generate a txt or html report containing information about it (function list, variable list, etc.)?
The program I intend to build is similar to doxygen, but I want it for my personal use.
ctags, perhaps?
Ctags generates an index (or tag) file of language objects found in source files that allows these items to be quickly and easily located by a text editor or other utility. A tag signifies a language object for which an index entry is available (or, alternatively, the index entry created for that object).
Both Python and Perl have excellent string processing capabilities.
I'd suggest using something like ctags to parse the program, and then writing a small script to read the ctags file and output txt/html.
The file format used by ctags is well-defined so that other programs can read it. See http://ctags.sourceforge.net for more information on ctags itself and the file it uses.
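As a sketch of that approach, here is a small C filter that reads a ctags tags file on stdin and emits an HTML list of the function entries. It assumes the usual tab-separated layout (name, file, ex command, then a kind letter such as f for functions):

#include <stdio.h>
#include <string.h>

int main(void)
{
    char line[4096];
    puts("<html><body><ul>");
    while (fgets(line, sizeof line, stdin)) {
        if (line[0] == '!')
            continue;  /* skip the !_TAG_... pseudo-tag header lines */
        char *name = strtok(line, "\t");
        char *file = strtok(NULL, "\t");
        strtok(NULL, "\t");  /* the ex search command; ignored here */
        char *kind = strtok(NULL, "\t\n");
        if (name && file && kind && kind[0] == 'f')  /* functions only */
            printf("<li><code>%s</code> (%s)</li>\n", name, file);
    }
    puts("</ul></body></html>");
    return 0;
}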
You're opening a big can of worms, this isn't an effective use of your time, blah blah blah, etc.
Moving on to an answer, if you're talking about anything beyond trivial analysis and you need accuracy, you will need to parse the C source code. You can do that in any language, but you will almost certainly want to generate your parser from a high-level grammar. There are any number of tools for that. A modern and particularly powerful parser generator is ANTLR; there are a number of ANTLR grammars for C, including easier-to-work-with subsets.
Look into scripting languages. I'd recommend Python or Perl.
Haskell has a relatively recent language-c project http://www.sivity.net/projects/language.c which allows the analysis of C code.
If you are familiar with Haskell, then it might be worth a look. Even if you are not, it might be interesting to have a go.
If it's a programming language you want, then I'd suggest something known for its string-processing power, which would mean Perl.
However, the task you describe can be rather complicated, since you need to 'know' the language; you would have to follow the same steps the compiler does, namely lexical and grammatical analysis (think flex, think yacc), in order to truly 'know' what those strings mean.
Perhaps the best starting point is to take a look at doxygen and try to reuse as much of the work done there as possible.
Lex/yacc are appropriate for building parsers.
pycparser is a complete parser for ANSI C89/C90 written in pure Python. It's being widely used to analyze C source code for various needs. It comes with some example code, such as listing all the function definitions in files, etc.
