I want to write a library that is dynamically loadable and callable from C code, but I really don't want to write it in C - the code is security critical, so I want a language that makes it easier to have confidence that my code is correct. What are my options?
To be more specific, I want C programmers to be able to #include this, and -l that, and start using my library just as if I had written it in C. I'd like programmers in other languages to be able to use their favourite tools for linking to C libraries to link to it. Ideally I'd like that to be possible on every platform that supports C, but I'll settle for Linux, Windows and MacOS.
Anything that compiles to native code will do, so you might Google for exactly that: "languages that compile to native code". See, e.g., Programming languages that compile to native code and have the batteries included.
C++ is often the choice for this. It compiles to native code and, provided you keep your interfaces simple, it is easy to write an adapter layer.
Objective C and Fortran are also possible.
It sounds like you are looking for a language with ABI compatibility with C, or at least one that compiles to native code. So long as it can be compiled to a valid object file (typically an .obj or .o file) which is accepted by the linker, that is the main criterion. You will also want to write a header file as a convenience for any client code written in C (or a closely related language/variant thereof).
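To make that concrete, here is a minimal sketch of such a header (the library name and function are made up for illustration). Clients just #include it and link against the library, regardless of what language produced the object code:

    /* mylib.h -- hypothetical public interface */
    #ifndef MYLIB_H
    #define MYLIB_H

    #include <stddef.h>

    #ifdef __cplusplus
    extern "C" {              /* keep C linkage for C++ clients too */
    #endif

    /* Returns 0 on success, nonzero on error. */
    int mylib_encrypt(const unsigned char *in, size_t len,
                      unsigned char *out, size_t out_cap);

    #ifdef __cplusplus
    }
    #endif

    #endif /* MYLIB_H */

As long as only plain C types (and pointers to them) cross this boundary, the implementation language behind the object file is invisible to callers.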
As mentioned by others, you need a pretty good reason for choosing a language other than C, as it is the lingua franca of low-level/systems software. Assembler is an option, although it is harder to port between platforms. D is a more portable - but less widespread - alternative which is designed to produce secure, efficient native code with a minimum of fuss. There are many others.
Almost every security-critical application I know of is written in C. I don't believe any other language has a stronger real-world track record for producing secure applications.
C is often said to be a poor language for security by people who don't understand it.
If you want C programmers to use your library, use C. Doing anything else is tying one hand behind your back whilst trying to walk on a balance beam (the gymnastics equipment). Sure, there are dozens of other languages that are CAPABLE of interfacing to C, but it typically involves a C layer that stuffs the C data types into a language-specific data type (Java objects, Python objects, etc.), and when the call is finished, converts back to a C data type. That just makes it harder to work with, and potentially slower if you don't get all the design decisions right. And people won't understand the source code, so they won't like to use it (see more about this below).
If you want security, then write very good code, with your "security aspects" hat firmly on at all times; find a security mailing list or website and post the code there for review, take the review comments on board, understand them, and fix any that are meaningful to fix. Distribute the source code to your users, so people can see what your code does. Those who understand security will know what to look for and will see that you have done a good job (or a bad job, whichever is applicable) - and those who don't will hopefully trust the right people. If it's good, people will use it. If it's "hidden" and not easy to access, you won't get many customers, no matter what language you use.
Don't worry, you won't reveal anything more from releasing source. If there is a flaw in the code, and it is popular (or important) enough, someone will find the flaw, even if you publish only binaries. For those skilled in reverse engineering, not having source code is only a small obstacle.
Security doesn't stem from using a specific language or a specific tool, it stems from good design and good basic understanding of the problems with security.
And remember: security by obscurity (whether that means "hidden source code", an "unusual language", or something else obscure) is false security.
You might be interested in ATS, http://ats-lang.sourceforge.net/. ATS compiles via C, can be as efficient as C, and can be used in a way that is ABI-compatible with C. From the project website:
ATS is a statically typed programming language that unifies implementation with formal specification. It is equipped with a highly expressive type system rooted in the framework Applied Type System, which gives the language its name. In particular, both dependent types and linear types are available in ATS. The current implementation of ATS (ATS/Anairiats) is written in ATS itself. It can be as efficient as C/C++ (see The Computer Language Benchmarks Game for concrete evidence) and supports a variety of programming paradigms
ATS's dependent and linear type system helps produce static guarantees about your code, including various aspects of resource management safety.
Chris Double has been writing a series of articles exploring the power of ATS's type system for systems programming here: http://bluishcoder.co.nz/tags/ats/. Of particular note is this article: http://bluishcoder.co.nz/2012/08/30/safer-handling-of-c-memory-in-ats.html
This document covers aspects of calling back and forth between ATS and C code: https://docs.google.com/document/d/1W6DYQApEqKgyBzMbvpCI87DBfLdNAQ3E60u1hUiMoU0
The main downside is that dependently-typed programming is still a daunting prospect, even outside systems programming. The syntax of the language is also a bit unusual: consider lexical quirks such as the use of abst@ype as a keyword. Finally, ATS is to some degree a research project, and I personally don't know whether it would be sensible to adopt for a commercial endeavour.
Theoretically, it's going to be Fortran: less indirection (as in: my array is [here], not just a pointer to here, and this is true of most but not all of your data structures and variables).
However... There are many gotchas and quirks in Fortran: not, perhaps, as many as in C but you probably know your way around C rather better than Fortran. Which is the point behind most of the comments saying 'Know your code' - but do you really know what your compiler is doing?
Knowing you, I'm prepared to take it on trust that you do, for C. Most programmers don't. You do not know and cannot know what a local JVM or JIT compiler does, and that's a black hole in your security model if you're using Java, C#, or scripting languages.
Ignore anyone who tells you that the hairy-chested he-men of secure computing write their own assembler: they probably don't even know the security errors they're making in any and all nontrivial projects they release. Know your compiler, indeed.
You could write it in Lua - providing a C API to a Lua library is relatively straightforward. C++ is also an option, though of course you'd have to write C wrappers and make sure no exceptions can escape your functions. But honestly, if it's security-critical, the minor inconveniences of the C language shouldn't be that big a deal. What you really should be doing is proving the correctness of your program where feasible, and testing extensively where it's not.
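To sketch what the Lua option can look like in practice (the function names and the embedded script here are made up for illustration), the exported C functions hide a Lua state and forward calls to Lua code:

    /* mylib.c -- hypothetical C wrapper around logic written in Lua */
    #include <lua.h>
    #include <lualib.h>
    #include <lauxlib.h>

    static lua_State *L = NULL;

    int mylib_init(void) {
        L = luaL_newstate();
        if (L == NULL) return -1;
        luaL_openlibs(L);
        /* real code would load a script file; a string keeps the sketch short */
        return luaL_dostring(L, "function check(n) return n * 2 end");
    }

    double mylib_check(double n) {
        lua_getglobal(L, "check");        /* push the Lua function */
        lua_pushnumber(L, n);             /* push its argument */
        if (lua_pcall(L, 1, 1, 0) != 0)   /* call: 1 argument, 1 result */
            return -1.0;
        double result = lua_tonumber(L, -1);
        lua_pop(L, 1);                    /* leave the stack balanced */
        return result;
    }

Callers only ever see plain C functions; the Lua interpreter is an implementation detail linked into the library.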
You can write a library in Java. JNI is normally used to call C from Java, but it can be used the other way around.
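For the record, "the other way around" means the JNI invocation API: the C side starts a JVM and calls into a Java class. A rough, hypothetical sketch (the class and method names are made up):

    /* Hypothetical: embed a JVM and call a static Java method from C. */
    #include <jni.h>
    #include <stdio.h>

    int main(void) {
        JavaVM *jvm;
        JNIEnv *env;
        JavaVMOption options[1];
        options[0].optionString = "-Djava.class.path=.";

        JavaVMInitArgs vm_args;
        vm_args.version = JNI_VERSION_1_6;
        vm_args.nOptions = 1;
        vm_args.options = options;
        vm_args.ignoreUnrecognized = JNI_FALSE;

        if (JNI_CreateJavaVM(&jvm, (void **)&env, &vm_args) != JNI_OK)
            return 1;

        jclass cls = (*env)->FindClass(env, "MyLib");
        if (cls == NULL) return 1;
        jmethodID mid = (*env)->GetStaticMethodID(env, cls, "check", "(I)I");
        if (mid == NULL) return 1;

        jint result = (*env)->CallStaticIntMethod(env, cls, mid, 42);
        printf("Java returned %d\n", result);

        (*jvm)->DestroyJavaVM(jvm);
        return 0;
    }

Note that this pulls a whole JVM into every client process, which may or may not be acceptable for a security-critical library.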
There is finally a decent answer to this question: Rust.
I'm a web developer mostly working in Ruby and C#.
I want to learn a low-level language so I don't look like an ass in front of my (computer science expert) boss.
I've heard a lot of purist buzz about how assembly language is the only way to learn how computers actually work, but on the other hand C would probably be more useful as a language rather than just for theory.
So my question is:
Would learning C teach me enough computer science theory / low-level programming to not look like a common dandy (complete tool)?
Thanks!
Daniel
Thanks guys!
Some really great answers!
I think I'll learn C just to get a grasp of memory management, but I think you're right that I'll be better off studying more of the languages I use!
First learn the actual theory. Learning C's syntax means nothing if you can't do anything meaningful with it. Once you have a solid grasp of how algorithms and data structures work, that knowledge will be applicable in most languages you'll probably use.
If you're planning to work in C# and Ruby, I don't see the point in learning assembler just for the sake of doing so.
You'll be fine learning just C since it's C.
You can do "real programming" in any language, as long as you do it right.
Would learning C teach me enough computer science theory / low-level programming to not look like a common dandy (complete tool)?
You're using C# which can perform unmanaged pointer manipulation, not too far off what you can achieve in C++ land. For instance, games can be successfully programmed in C#, although most games will be C++.
Assembler is basically moving bits of memory around, and sometimes doing a bad job of it if you don't know what you're doing. C/C++ compilers generate quite good assembly code underneath, so why would you write it yourself unless you had to write low-level driver software? The only bit of assembler in our 2,000,000 lines of mixed C++/C# code is there to detect VMware.
I don't think, in this modern age and given your description of your job, that there is much call for you to know how many registers you can use on your processor. By all means learn C++, but as mentioned before, learning syntax is not the same as learning how to apply it.
At a minimum, learn and understand patterns (Gang of Four) and OO design, and understand the implications and benefits of different testing methodologies and the impact of all this on business costs.
Learning C will get you all the information you need and let you accomplish things in a reasonable time.
I suggest you do a few meaningful projects in C, then one or two in an assembler dialect, maybe for a microcontroller or specializing in SSE/MMX.
In the end it all comes down to opcodes, registers, and addressing modes, of which C teaches you absolutely nothing. However, assemblers are tightly coupled to their platforms; learning how to write assembler on x86 won't help you much when you work on a Sparc or PA-RISC box.
C really isn't going to teach you much that other languages won't in terms of theory. The thing about C is that it doesn't provide many tools or abstractions beyond naked pointers and byte streams. You have to roll all your own containers (lists, stacks, queues, etc.), and if you want them to be generic, you have to figure out how to separate the code that needs to be type aware (assignments and comparisons) from the general algorithm. You have to manage your own memory. Text processing is a pain in the ass (C doesn't even provide a dedicated string type).
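One way to see what that separation looks like in practice is qsort from the standard C library: the generic algorithm only sees untyped bytes, and you supply the single type-aware piece (the comparison) as a function pointer:

    #include <stdio.h>
    #include <stdlib.h>

    /* The only type-aware part: how to compare two elements. */
    static int cmp_int(const void *a, const void *b) {
        int x = *(const int *)a, y = *(const int *)b;
        return (x > y) - (x < y);
    }

    int main(void) {
        int v[] = { 42, 7, 19, 3 };
        /* The generic part: qsort only knows "4 elements of sizeof(int) bytes each". */
        qsort(v, 4, sizeof v[0], cmp_int);
        for (int i = 0; i < 4; i++)
            printf("%d ", v[i]);
        printf("\n");
        return 0;
    }

Rolling your own containers in C follows the same pattern: void pointers, an element size, and caller-supplied callbacks.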
Is that something that would be useful to know for your day-to-day work, or to impress your boss? Maybe. I'm not sure how you would apply that knowledge in your C# or Ruby code, which operates in a virtual machine (as opposed to native binaries generated from C).
If you want to impress your boss with computer science knowledge, read up on Turing machines and Markov algorithms.
I don't think knowledge of Assembler would make you a better Ruby / C# programmer.
You can go as low level as writing to a disk using a magnetized needle, but if your programs aren't working or are insecure, you wouldn't have gained anything by it.
Learning the syntax of a "new" language won't assist you in gaining more knowledge of the depth of programming.
Real programming is not about a particular low-level programming language. To understand how a computer works, it doesn't hurt to know some CPU instructions and how the hardware processes them.
To become a good programmer it is absolutely not necessary to know how the hardware works, unless you want to program hardware.
I wouldn't worry about looking like a fool in front of that boss if that's the criteria your boss has for not being a fool.
C and assembler aren't the best languages if you want to learn computer science. Since you know Ruby and C#, I guess you already know some object orientation.
"Real programming" works the same in every language and that's what you should really worry about. That being said, working up something in C is a very useful exercise/
Learning C won't necessarily teach you too much about how computers work, but it is a good access door to that, considering that it is still the language of choice for system programming.
Learning ASM is of course useful as well, but it's probably uncalled for if you just want to apply it to your job. It might teach you a few useful concepts, though, or at least help you get a better understanding of how the CLR works (or, in general, how bytecode compilation and bytecode-compiled code do their stuff). Maybe you could try learning ASM for an older/simpler platform: there's still a heavy Atari 2600 scene around, and due to the platform's inherent limitations, some of the hacks they do to squeeze extra functionality into a game are quite awesome. The modern x86_64 architecture is pretty complex and somewhat hairy, although that's more of a personal opinion than an actual fact. Learning to use a microcontroller might also get the job done, but be warned that many (most?) use a Harvard architecture (i.e. separate program and data memory), which is different from that of a typical general-purpose CPU.
Ask your boss first, then do exactly what he tells you; finally, ask again for feedback and review of the exercises you do. Act accordingly. Repeat until he says "you now know more than me".
This is the most efficient way of exploiting the expertise of a real expert you are lucky enough to have, and it will leave an extremely positive impression forever.
It will be about as difficult as learning C or assembler alone in your spare time.
I want to learn a low-level language so I don't look like an ass in front of my (computer science expert) boss.
This is so true in many respects. I learned C++ and Haskell for similar reasons; I have no regrets, since they taught me many things, but I do understand the peer pressure.
This is a six-year-old question, and it is interesting to see that no one has mentioned a good textbook on programming that covers the fundamentals of computer science.
If you want to learn "real programming", just try to tackle Structure and Interpretation of Computer Programs (MIT Press) in your free time. It should cover most if not all of your curiosity with respect to programming, whatever your level is.
There is also "The Art of Computer Programming" by Donald Knuth. Knuth is especially well versed in machine languages and other low level stuff.
Beware, though: both of these works are very dense.
Here are my two cents on how to approach them. First of all, they are not "tutorials" that sugar-coat your way towards a skill; they introduce a scientific discipline in an appropriate manner, so the mindset of "what am I going to do with this? How can I use this?" should be replaced by "how does this work? What does this do? Why does it do it this way?".
Try to do the exercises included in the books, but don't push yourself too hard. If you can't do one, that's okay; move on to the next.
However, if you cannot do any of the exercises given in a section, you should take your time with that chapter and revise it until you can at least do a couple of the exercises from the section.
Just keep this up, and you will catch up with your boss/peers in no time. Some chapters/exercises might even become a subject of conversation if you feel like it. He might even like the fact that you are acquiring new skills - or not; don't take my word for it, I do programming so as not to deal with people.
Well, "Real Programming" doesn't refer to a specific language, or low level languages. In my mind, "real Programming" is the process of analyzing a task, deciding the best way to solve it, and putting that method into a set of simple steps, and then revising and editing as necessary until you have achieved your goal. From that, the language doesn't really matter, it's all in the thought process. Furthermore, there is nothing wrong with high level languages, I use python all the time and love it. However, learning low level languages can really help develop your understanding programs and computers in general. would suggest c, because it is where c# and c++ stem from, and would provide a bases for understanding both those languages.
I am faced with the task of building a new component to be integrated into a large existing C codebase. The component is essentially a kind of compiler, and will be complicated enough that I would like to write it in OCaml (for reasons along the lines of those given here). I know that OCaml-C interaction is possible (as per the manual and this tutorial), but it looks somewhat painful.
What I'd like to know is whether others here have attempted large-scale integration of OCaml and C code, what were some of the unexpected gotchas they found, and whether at the end of the day they concluded that they would have been better off just writing the new code in C.
Note, I'm not trying to start a debate about the merits of functional versus imperative programming: let's just say we assume that OCaml happens to be the right tool for the job I have in mind, and the potential difficulty in integration is the only issue. I also don't have the option of rewriting the rest of the codebase.
To give a little more detail about the task: the component I need to implement is a certain kind of query optimizer that incorporates some research ideas my group at UC Davis is working on, and will be integrated into PostgreSQL so that we can run experiments. (A query optimizer is, essentially, a compiler.) The component would be invoked from C code, would function mostly independently but would make a certain number of calls to other PostgreSQL components to retrieve things like system catalog information, and would construct a complex C data structure (representing a physical query plan) as output.
Apologies for the somewhat open-ended question, but I'm hoping the community might be able to save me a little trouble :)
Thanks,
TJ
Great question. You should be using the better tool for the job.
If in fact your intention is to use the better tool for the job (and you are sure lex and yacc are going to be a pain), then I have something to share with you: it's not painful at all to call OCaml from C, and vice versa. Most of the time I've been writing OCaml calling C, but I have written a few calls the other way. They've mostly been debug functions that don't return a result. The calling back and forth is really about packing and unpacking the OCaml value type on the C side. That tutorial you mention covers all of that, and very well.
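As a rough sketch of what that packing and unpacking looks like on the C side (the "optimize" name is made up; this assumes the OCaml code registered the closure with Callback.register and that the OCaml runtime has already been started with caml_startup):

    /* Hypothetical C stub calling into OCaml. */
    #include <caml/mlvalues.h>
    #include <caml/callback.h>

    int call_optimize(int plan_id) {
        static const value *closure = NULL;
        if (closure == NULL)
            closure = caml_named_value("optimize");    /* registered OCaml closure */
        value result = caml_callback(*closure, Val_int(plan_id));  /* pack arg, call */
        return Int_val(result);                                    /* unpack result */
    }

For ints and strings this is painless; it is recursive and structured data (tuples, records, variants) where the packing and unpacking starts to hurt, as discussed below.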
I disagree with Ron Savage's remark that you have to be an expert in the language. I recall starting out where I work and, within a few months, without knowing what a "functor" was, being able to call C, and writing a thousand lines of C for numerical recipes and abstract data types. There were some hiccups (not with unpacking types, but with garbage collection of an abstract data type), but it wasn't bad at all. Most of the inner loops in the project are written in C - taking advantage of SSE, external libraries (LAPACK), tighter optimized loops, and some inlined hand-optimized assembly.
I think you might need to be experienced with designing a large project and demarcating the functional and imperative sections. I would really assess how much OCaml you are going to be writing, and what kind of values you want to pass to C. I say this because I'd be wary of recommending that someone pass a recursive data structure from OCaml to C: it would mean a lot of unpacking of tuples and their contents, and thus a lot of opportunity for confusion and bugs.
I once wrote a reasonably complex OCaml-C hybrid program. I was frustrated by what I found to be inadequate documentation, and I ended up spending too much time dealing with garbage collection issues. However, the resulting program worked and was fast.
I think there is a place for OCaml-C integration, but make sure it is worth the hassle. It might be simpler to have the programs communicate over a socket (assuming such IO operations won't eliminate the performance you want). It might also be more sane to just write the whole thing in C.
Interoperability is the Achilles heel of standalone implementations of statically typed languages, particularly those without JIT compilation, like OCaml. My own experience, having used OCaml for over 5 years, is that the only reliable bindings are across simple APIs that do little more than pass large arrays, e.g. LAPACK. Even slightly more complicated bindings, like those to FFTW, took years to stabilize, and others, like OpenGL and GLU, remain an unsolved problem. In particular, I found major bugs in binding code written by two of the authors of the OCaml compiler. If they cannot get it right, then there is little hope for the rest of us...
However, all is not lost. The solution is simply to use looser bindings. Rather than handling interoperability at the C level with a low-level type-unsafe interface, use a high-level interface like XML-RPC with string passing or even over sockets. This is much easier to get right and, as you say, will let you leverage the enormous practical benefits offered by OCaml for this application.
My rule of thumb is to stick with the language / model / style used in the existing code-base, so that future maintenance developers inherit a consistent and understandable set of application code.
The only way I could justify something like what you are suggesting would be if:
You are an Expert at OCaml AND a Novice at C (so you'll be 20x as productive)
You have successfully integrated it with a C library before (apparently not)
If you are at all more familiar with C than OCaml, you've just lost any "theoretical" gain from OCaml being easier to use when writing a compiler - plus it seems as though you will have more peers familiar with C around you than OCaml.
That's my "grumpy old coder" 2 cents (which used to only cost a penny!).
I came across this: Writing a compiler using Turbo Pascal
I am curious if there are any tutorials or references explaining how to go about creating a simple C compiler. I mean, it is enough if it gets me to the level of making it understand arithmetic operations. I became really curious after reading this article by Ken Thompson. The idea of writing something that understands itself seems exciting.
Why did I put up this question instead of asking Google? I tried Google, and the Pascal one was the first link. The rest did not seem relevant, and added to that... I am not a CS major (so I still need to learn what all those tools like yacc do) and I want to learn this by doing, and I am hoping people with more experience are always better at these things than Google. I want to read some article written in the same spirit as the one I listed above, but one that highlights at least the bootstrapping phases of building a simple C compiler.
Also, I don't know the best way to learn. Do I start off building a C compiler in C or some other language? Do I write a C compiler or some other language? I feel questions like this are better answered once I have some direction to explore. Any suggestions?
Any suggestions?
A compiler consists of three pieces:
A parser
An abstract syntax tree (AST)
An assembly code generator
There are lots of nice parser generators that start with language grammars. Maybe ANTLR would be a good place for you to start. If you want to stick to C roots, try lex/yacc or bison.
There are grammars for C, but I think C in its entirety is complex. You'd do well to start off with a subset of the language and work your way up.
Once you have an AST, you use it to generate the machine code that you'll run.
It's doable, but not trivial.
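To make those three pieces concrete, here is a hypothetical toy sketch in C that parses arithmetic expressions into an AST (a hand-written recursive-descent parser standing in for what ANTLR or lex/yacc would generate) and then "generates code" for an imaginary stack machine:

    /* toy_expr.c -- hypothetical sketch: parser -> AST -> "assembly" for a stack machine.
       No error handling or memory freeing; it only exists to show the three stages. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <ctype.h>

    typedef struct Node { char op; long value; struct Node *lhs, *rhs; } Node;

    static const char *src;   /* cursor into the source text */

    static Node *new_node(char op, long value, Node *lhs, Node *rhs) {
        Node *n = malloc(sizeof *n);
        n->op = op; n->value = value; n->lhs = lhs; n->rhs = rhs;
        return n;
    }

    static void skip_ws(void) { while (isspace((unsigned char)*src)) src++; }

    static Node *parse_expr(void);

    static Node *parse_primary(void) {            /* number | '(' expr ')' */
        skip_ws();
        if (*src == '(') {
            src++;
            Node *n = parse_expr();
            skip_ws();
            if (*src == ')') src++;
            return n;
        }
        char *end;
        long v = strtol(src, &end, 10);
        src = end;
        return new_node('n', v, NULL, NULL);
    }

    static Node *parse_term(void) {               /* primary (('*'|'/') primary)* */
        Node *n = parse_primary();
        for (skip_ws(); *src == '*' || *src == '/'; skip_ws()) {
            char op = *src++;
            n = new_node(op, 0, n, parse_primary());
        }
        return n;
    }

    static Node *parse_expr(void) {               /* term (('+'|'-') term)* */
        Node *n = parse_term();
        for (skip_ws(); *src == '+' || *src == '-'; skip_ws()) {
            char op = *src++;
            n = new_node(op, 0, n, parse_term());
        }
        return n;
    }

    /* "Code generation": a post-order walk emitting stack-machine instructions. */
    static void emit(const Node *n) {
        if (n->op == 'n') { printf("    push %ld\n", n->value); return; }
        emit(n->lhs);
        emit(n->rhs);
        switch (n->op) {
        case '+': puts("    add"); break;
        case '-': puts("    sub"); break;
        case '*': puts("    mul"); break;
        case '/': puts("    div"); break;
        }
    }

    int main(void) {
        src = "1 + 2 * (3 - 4)";
        emit(parse_expr());
        return 0;
    }

Swap the emit() walk for one that prints real x86 or ARM instructions and you have the skeleton described above.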
I'd also check Amazon for books about writing compilers. The Dragon Book is the classic, but there are more modern ones available.
UPDATE: There have been similar questions on Stack Overflow, like this one. Check out those resources as well.
I recommend this tutorial:
LLVM tutorial
It is a small example on how to implement a "small language" compiler. The source code is very small and is explained step by step.
There is also a C front end for the LLVM (Low Level Virtual Machine, which represents the internal structure of a program) library:
Clang
For what it's worth, the Tiny C Compiler is a pretty full-featured C compiler in a relatively small source package. You might benefit from studying that source, as it's probably significantly easier to understand than trying to comprehend all of GCC's source base, for instance.
This is my opinion (and conjecture): it will be hard to write a compiler without understanding the data structures normally covered in undergraduate (post-secondary) computer science classes. This doesn't mean you cannot do it, but you will need to know essential data structures such as linked lists and trees.
Rather than writing a full or standards-compliant C compiler (at least at the start), I would suggest limiting yourself to a basic subset of the language, such as common operators, integer-only support, and basic functions and pointers. One classic example of this was Ron Cain's Small-C, made popular by a series of articles written in Dr. Dobb's Journal in, I believe, the 1980s. They publish a CD with James Hendrix's out-of-print book, A Small-C Compiler.
What I would suggest is following Crenshaw's tutorial, but writing it for a C-like language and whatever CPU target you wish (Crenshaw targets the Motorola 68000). To do this, you will need to know the basic assembly of whichever target you want to run the compiled programs on. This could include an emulator for a 68000 or MIPS, which are arguably nicer assembly instruction sets than the venerable CISC instruction set of the Intel x86 (16/32-bit).
There are many potential books that can be used as starting points for learning compiler/translator theory (and practice). Read the comp.compilers FAQ and reviews at various online booksellers. Most introductory books are written as textbooks for sophomore- to senior-level undergraduate computer science classes, so they can be slow reading without a CS background. One older book that might be more introductory, but easier to read than "The Dragon Book", is Introduction to Compiler Construction by Thomas Parsons. It is older, so you should be able to find a used copy from your choice of online booksellers at a reasonable price.
So I'd say, try starting with Jack Crenshaw's Let's Build a Compiler tutorial, write your own, following his examples as a guide, and build the basics of a simple compiler. Once you have that working, you can better decide where you wish to take it from that point.
Added:
In regard to the bootstrapping process: since there are existing C compilers freely available, you do not need to worry about bootstrapping. Write your compiler with separate, existing tools (GCC, Visual C++ Express, MinGW / DJGPP, tcc), and you can worry about self-compiling your project at a much later stage. I was surprised by this part of the question until I realized you were brought to the idea of writing your own compiler by reading Ken Thompson's ACM Turing Award speech, Reflections on Trusting Trust, which does go into the compiler bootstrapping process. It's a moderately advanced topic, and simply a lot of hassle as well. I found even bootstrapping the GCC C compiler under older Unix systems that already included a C compiler (Digital OSF/1 on the 64-bit Alpha) to be a slow, time-consuming, error-prone process.
The other sort-of question was what a compiler tool like yacc actually does. Yacc (Yet Another Compiler Compiler; Bison is the GNU version) is a tool designed to make writing a compiler (or translator) parser easier. Based on the formal grammar for your target language that you feed it, it generates a parser, which is one portion of a compiler's overall design. Next is lex (or flex, from GNU), which is used to generate a lexical analyzer or scanner, often combined with the yacc-generated parser to form the skeleton of the front end of a compiler. These tools make writing a front end arguably easier than writing a lexical analyzer and parser yourself. Crenshaw's tutorial does not use these tools, and you don't need to either; many compiler writers don't always use them. Of course, Crenshaw admits the tutorial's parser is quite basic.
Crenshaw's tutorial also skips generating an AST (abstract syntax tree), which simplifies but also limits the tutorial compiler. It lacks most if not all optimization, and is very tied to the specific programming language and the particular assembly language emitted by the "back-end" of the compiler. Normally the AST is a middle piece where some optimization can be performed, and serves to de-couple the compiler front-end and back-end in design. For a beginner without a Computer Science background, I'd suggest not worrying about not having an AST for your first compiler (or at least the first version of it). I think keeping it small and simple will help you finish writing a compiler, in its first version, and you can decide from there how you want to proceed then.
You might be interested in the book/course The Elements of Computing Systems:Building a Modern Computer from First Principles.
Note that this isn't about building a "pc" from stuff you bought off newegg. It begins with a description of Boolean logic fundamentals, and builds a virtual computer from the lowest levels of abstraction to progressively higher levels of abstraction. The course materials are all online, and the book itself is fairly inexpensive from Amazon.
In the course, in addition to "building the hardware", you'll also implement an assembler, virtual machine, compiler, and rudimentary OS, in a step-wise fashion. I think this would give you enough of a background to delve deeper into the subject area with some of the more commonly recommended resources listed in the other answers.
In The Unix Programming Environment, Kernighan and Pike walk through five iterations of building a calculator, working from simple C-based lexical analysis and immediate execution up to yacc/lex parsing and code generation for an abstract machine. Because they write so wonderfully, I can't suggest a smoother introduction. Their language is certainly smaller than C, but that is likely to your advantage.
How do I [start writing] a simple C compiler?
There's nothing simple about compiling C. The best simple C compiler is lcc by Chris Fraser and David Hanson. They spent 10 years working on the design to make it as simple as they possibly could, while still generating reasonably good code. If you have access to a university library, you should be able to get their book.
Do I start off building a C compiler in C or some other language?
Some other language. One time I got to ask Hanson what lessons he and Fraser had learned by spending 10 years on the lcc project. The main thing Hanson said was
C is a lousy language to write a compiler in.
You're better off using Haskell or some dialect of ML. Both languages offer functions over algebraic data types, which is a perfect match to the problems faced by the compiler writer. If you still want to pursue C, you could start with George Necula's CIL, which is a big chunk of a C compiler written in ML.
I want to read some article written in the same spirit as the one I listed above but that which highlights at least the bootstrapping phases...
You won't find another article like Ken's. But Andrew Appel has written a nice article called Axiomatic Bootstrapping: A Guide for Compiler Hackers. I couldn't find a free version, but many people have access to the ACM Digital Library.
Any suggestions?
If you want to write a compiler,
Use Haskell or ML as your implementation language.
For your first compiler, pick a very simple language like Oberon or like P0 from Niklaus Wirth's book Algorithms + Data Structures = Programs. Wirth is famous for designing languages that are easy to compile.
You can write a C compiler for your second compiler.
A compiler is a complex piece of software that covers aspects of:
Input processing, involving lexing and parsing
Building a symbol store of every variable used, plus an abstract syntax tree (AST)
Walking the AST to generate machine code based on the syntax
This is by no means exhaustive; it is a bird's-eye view from the top of the mountain. It boils down to getting the syntax notation correct and ensuring that malformed input does not throw it off: good input processing should never fall over, no matter how malformed, terrible, or abusive the input thrown at it is. It also comes down to deciding and knowing what the output is going to be. If it is machine code, that implies you may have to get to know the processor's instructions intimately, including memory addressing for variables and so on...
Here are some links for you to get started:
There was a port of Jack Crenshaw's code to C... (I recall downloading it months ago.)
Here's a link to a similar question here on SO.
Also, here's another small compiler tutorial, for a Basic-to-x86-assembler compiler.
Tiny C Compiler
Hendrix's Small C Compiler found here.
It might be worthwhile to learn about functional programming, too. Functional languages are well-suited to writing a compiler both in and for. My school's intro compilers class contained an intro to functional languages and the assignments were all in OCaml.
Funny you should ask this today, since just a couple days ago I wrote a lambda calculus interpreter. Lambda calculus is the granddaddy of all functional languages. It's just 200 lines long (in C++, incl. error reporting, some pretty printing, some unicode) and has a two-phase structure, with an intermediate format that could be used to generate code.
Not only is starting small and building up the most practical approach to compilers, it also encourages good, modular, organizational practice.
A compiler is a very large project, although I suppose it wouldn't hurt to try.
I know of at least one C compiler written in Pascal, so it's not the most insane thing you could do. I personally would pick a more modern language in which to implement my C compiler project, both for the simplicity (it's easy to d/l packages for Python, Ruby, C, C++ or Java) and because it will look better on your resume.
In order to do a compiler as a beginner project, though, you will need to drink all of the Agile kool-aid.
Always have something running, even if it doesn't do much of anything. Add things to your compiler only in small steps. ("Frequent releases".) Pick a viciously tiny subset of the language and implement that first. (Support only i = 0; at first and expand things from there.)
If you want a mind-blowing experience that teaches you how to write compilers that compile themselves, you need to read this paper from 1964.
META II a syntax-oriented compiler writing language by Val Schorre.
In 10 pages, it tells you how to write compilers, how to write meta compilers, provides a virtual metacompiler instruction set, and a sample compiler built with the metacompiler.
I learned how to write compilers from this paper back in the late 60s, and used the ideas to construct C-like languages for several minicomputers and microprocessors.
If the paper is too much by itself (it isn't!) there's an online tutorial which will walk you through the whole thing.
And if getting the paper from the original link is awkward because you are not an ACM member, you'll find that the tutorial contains all the details anyway. (IMHO, for the price, the paper itself is waaaaay worth it).
10 pages!
I would not recommend starting with C as the language to implement, nor with any of the compiler-generator or parser-generator tools. C is a very tricky language, and it's probably a better idea to just make up a language of your own. It can be a little C-like (e.g. use curly brackets to indicate the function body, use the same type names, so you don't have to remember what you called everything).
The tools for making compilers and parsers are great, but have the problem of really being a shorthand notation. If you don't know how to create a compiler in longhand, the shorthand will seem cryptic, needlessly restrictive etc. So write your own simple compiler first, then continue on from there. I also recommend you don't start generating actual machine code unless you eat and breathe assembler. Create your own bytecode interpreter with a VM.
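A minimal sketch of that last suggestion - an imaginary instruction set, a stack, and a dispatch loop (the opcode names here are made up):

    /* toy_vm.c -- hypothetical bytecode interpreter: a tiny stack machine */
    #include <stdio.h>

    enum { OP_PUSH, OP_ADD, OP_MUL, OP_PRINT, OP_HALT };

    static void run(const int *code) {
        int stack[256];
        int sp = 0;                       /* next free stack slot */
        for (int pc = 0; ; ) {
            switch (code[pc++]) {
            case OP_PUSH:  stack[sp++] = code[pc++]; break;
            case OP_ADD:   sp--; stack[sp - 1] += stack[sp]; break;
            case OP_MUL:   sp--; stack[sp - 1] *= stack[sp]; break;
            case OP_PRINT: printf("%d\n", stack[--sp]); break;
            case OP_HALT:  return;
            }
        }
    }

    int main(void) {
        /* bytecode for (2 + 3) * 4 -- the kind of thing your compiler would emit */
        int program[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD,
                          OP_PUSH, 4, OP_MUL, OP_PRINT, OP_HALT };
        run(program);
        return 0;
    }

Your compiler's back end then only has to emit arrays like program[], and you never have to touch a real instruction set until you want to.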
As to what language you should use to create your first compiler: It doesn't really matter, as long as the language is fairly complete. You will be reading input text, building data structures from them and writing out binary data. So if a language makes those things easier in any way, that's a point in favor of it. Pick a language you know well, so you can focus on creating the compiler, not learning the language. I usually use an OO language, which makes the syntax tree easier to write, a functional language would probably also work if you are familiar with that.
I've blogged a lot about programming languages, so you might find some useful postings here: http://orangejuiceliberationfront.com/category/language-design/
In particular, http://orangejuiceliberationfront.com/how-to-write-a-compiler/ is a starter on the particulars of parsing common constructs and generating something useful from that, as well as http://orangejuiceliberationfront.com/generating-machine-code-at-runtime/ which talks about actually spitting out Intel instructions that do something.
Oh, regarding bootstrapping of a compiler: You probably won't be able to do that right from the start. There is a fair amount of work involved in creating a compiler. So not only would writing a bootstrapping compiler involve writing the compiler (in some other language), once you have it, you would then have to write a second version of the compiler using itself. That's twice the work, plus the debugging needed in the existing and the bootstrapped new compiler until it all works. That said, once you have a working compiler, it is a good way to test its completeness. OK, maybe not twice the work, but more work. I'd go for the easy successes first, then move on from there.
In any event, have fun!
Often one of the main reasons given for learning C is that it brings you closer to programming at a low level which gives you insights into how things really work in higher level languages.
I've been programming in various assembly flavors for a while, and you can't get any lower-level than what I have been coding in; I have no illusions about what reference types or pointers really are. It should be pointed out that I am also fairly prolific in Java and C#. I don't only know assembly.
Is it still worth the effort cracking open K&R and spending some valuable time learning the intricacies of the C language or would my time best be served learning something else?
Often one of the main reasons given for learning C is that it brings you closer to programming at a low level
Well, scratch that one off, look at the others and decide for yourself. If you want to:
remain close to low level but gain portability
be able to work with the huge codebase that C has already
just learn something new
then you might want to learn C.
Yes, absolutely. Writing a program of any significant size is very cumbersome in straight assembly, so most applications that are written down-to-the-metal (like hardware drivers) will mostly be in C, or at least C gluing together calls to assembly functions.
Also, because C has such a close relationship with the machine (that is to say, it is low level), learning C after assembly will be a good stepping stone for understanding what a compiler really does to turn high-level code into machine instructions.
Absolutely! Learning C will improve your assembler programming as well. As you learn C you will start to transfer its structured methods to your assembler programming. I've noticed that the more I learn of high-level languages, the better the organization and understandability of my assembly language programming.
It is very useful to be able to mix C and assembler. Being able to use both in a single project allows you to use the appropriate solution in any given situation within that project. For most tasks C is quicker to code; occasionally the opposite is true and assembly language is quicker. Sometimes assembly language is better able to express a particular aspect of a solution (assembler's close mapping to the hardware can make programming I/O or device management clearer). For more abstract concepts C can be clearer (and C++ clearer again).
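For instance, with GCC or Clang on x86-64 you can even drop assembly straight into a C function via extended inline asm (a hypothetical sketch; the syntax and constraints are compiler- and architecture-specific):

    #include <stdio.h>

    static long add_asm(long a, long b) {
        long result;
        /* result = a + b, written by hand; "0" ties a to the output register */
        __asm__("addq %2, %0"
                : "=r"(result)
                : "0"(a), "r"(b));
        return result;
    }

    int main(void) {
        printf("%ld\n", add_asm(2, 3));   /* prints 5 */
        return 0;
    }

For anything larger than a few instructions, a separate .s file assembled and linked alongside the C code is usually cleaner.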
The same goes for learning C++. I find myself using an object oriented approach to both my C and assembler programming.
In the end it's horses for courses. Use the appropriate language for the problem at hand.
You know assembly and you seem to know C#. It's never a bad thing to learn yet another language but I would only recommend learning C if you are going to need it in the near future. I think you would broaden your knowledge more by learning a dynamic language like Ruby or a functional like Common Lisp.
No one has mentioned that C is a route to writing assembler more quickly.
Writing C is quicker to develop....
When I wrote computer games, we wrote everything in C first, then re-wrote the parts that took all the time - the old 80-20 rule: 80% of the time is spent in 20% of the code.
To do this we compiled the code we wished to rewrite using the compiler's dump-to-assembler-file flag. Then we took that C-generated assembler file and used it as the basis for writing more optimised assembler code. This was far quicker than starting from scratch.
Today this approach is far harder, as compilers are far better and it is much harder for humans to improve on the compiler's code - since processors have become so complicated, fast code is now about keeping the caches and pipelines full.
Writing C is portable between processors....
When writing our games we were able to port 80% of our code between machines with a recompile. More if we used the C versions of the 20% code which we would reimplement in assembler for speed.
Tony
I think the most important reason why you should learn any programming language is so that you can put it to some use.
If you've learnt Assembly to do something, and you feel you can do something else better in C, then go ahead by all means.
But if you find that you've got nothing to do in C, then professionally there's no point in learning it.
If, however, you want to do it as a hobby or a personal endeavor, then it's your time; do anything you want.
C is portable (if you write it carefully), that is a good reason for me.
Learning a new language is always a fun thing to do, especially if it's significantly different, paradigm-wise, from what you already know. So I'd say go for it.
I find it very interesting that C is still one of the most sought-after languages on major search engines and book sites.
Maybe not...but it won't hurt to learn it :) I personally learned x86 assembly before C and my assembly knowledge made it easier for me to grasp C pointers.
It depends. If you want to work with C++, you surely should also learn C. If you want to code in Python or Perl you don't really need it, as you already have an understanding of the internals from assembler.
One thing: did you work in assembler with pointers and the heap? Understanding pointers and memory management is very important for every higher-level language. If you didn't get the idea of pointers and the heap there, you should give C a try.
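If you want a quick self-test of what "pointers and the heap" means in C terms, it is essentially this: objects whose lifetime you manage yourself.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void) {
        char *copy = malloc(32);          /* heap allocation: yours to manage */
        if (copy == NULL) return 1;
        strcpy(copy, "hello, heap");
        printf("%s lives at %p\n", copy, (void *)copy);
        free(copy);                       /* forget this and you have a leak */
        return 0;
    }

If nothing in that snippet is surprising, assembler has already taught you the hard part.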
I look at it pragmatically - I wouldn't bother unless you feel like you have jobs where performance is more important than programmer productivity. After 12 years of programming, I've never come across a job that should have been written in C, instead of a garbage collected language. But, your situation may vary.
If all you knew was Java, then I would say yeah, it would be great.
I think it depends - do you (or might you in the future) have to deal with a codebase that includes C? There's an awful lot of it out there, so I'm actually surprised that you haven't already had a need to do something with C (at least reading it) given the assembly, C# and Java experience you cite.
Also, given that you know the above set of languages and the concepts that go along with them, I'd guess that learning C would be a cakewalk for you.
Yes, it is. C is like the core of all programming languages; almost every language is built on it, and since you know C# it won't take you much effort to learn C.
If you have done a lot of assembly work, then I guess you may someday, if not already, work with C. I don't think there is a programming job that wants you to know only assembly; C is needed. Even most low-level software, like operating systems, uses C, and Windows uses C++ as well. So in my opinion you shouldn't even think twice about it: C is fundamental knowledge, and even most web developers know some C.
Amazing that in a world of new scripting languages every day there are still people that manage to only know assembler.
I myself, after writing Perl or JavaScript for longer periods, always find lower-level languages like C or C++ somewhat lacking: for instance, where in Perl I write foreach (@array), in C/C++ I have to fiddle with for loops and indexes and/or iterators.
So, yes, I can only imagine how much you will get from the abstractions C will provide for you.
Additionally, widening your perspective is always a Good Thing.