Often one of the main reasons given for learning C is that it brings you closer to programming at a low level, which gives you insights into how things really work in higher-level languages.
I've been programming in various Assembly flavors for a while, and you can't get any lower-level than what I have been coding in, so I have no illusions about what reference types or pointers really are. I should also point out that I am fairly proficient in Java and C#; I don't only know assembly.
Is it still worth the effort of cracking open K&R and spending some valuable time learning the intricacies of the C language, or would my time be better spent learning something else?
Often one of the main reasons given for learning C is that it brings you closer to programming at a low level
Well, scratch that one off, look at the others and decide for yourself. If you want to:
remain close to low level but gain portability
be able to work with the huge codebase that C has already
just learn something new
then you might want to learn C.
Yes, absolutely. Writing a program of any significant size is very cumbersome in straight assembly, so most applications that are written down-to-the-metal (like hardware drivers) will mostly be in C, or at least C gluing together calls to assembly functions.
Also, because C has such a close relationship with the machine (that is to say, it is low level), learning C after assembly will be a good stepping stone for understanding what a compiler really does to turn high-level code into machine instructions.
Absolutely! Learning C will improve your assembler programming as well. As you learn C you will start to carry the structured approach over to your assembler programming. I noticed that the more I learn of high-level languages, the better the organization and understandability of my assembly language programming.
It is very useful to be able to mix C and assembler. Being able to use both in a single project allows you to use the appropriate solution in any given situation within that project. For most tasks C is quicker to code; occasionally the opposite is true and assembly language is quicker. Sometimes assembly language is better able to express a particular aspect of a solution (assembler's close mapping to the hardware can make programming I/O or device management clearer). For more abstract concepts C can be clearer (and C++ better again).
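To make that mixing concrete, here is a minimal sketch of calling an assembly routine from C, assuming GCC on x86-64 Linux; the routine name is invented for illustration, and the assembly (which would normally live in its own .s file) is embedded in a top-level asm block only to keep the example self-contained:

```c
#include <stdio.h>
#include <stdint.h>

/* Declared in C, defined in assembly; the linker joins the two. */
extern uint32_t saturating_add(uint32_t a, uint32_t b);

/* System V AMD64 calling convention: arguments arrive in %edi/%esi,
 * the result is returned in %eax. */
__asm__(
    ".globl saturating_add\n"
    "saturating_add:\n"
    "    movl %edi, %eax\n"
    "    addl %esi, %eax\n"
    "    jnc  1f\n"            /* no carry: the sum fits */
    "    movl $-1, %eax\n"     /* carry: clamp to UINT32_MAX */
    "1:  ret\n"
);

int main(void)
{
    printf("%u\n", saturating_add(4000000000u, 1000000000u)); /* 4294967295 */
    return 0;
}
```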
The same goes for learning C++. I find myself using an object oriented approach to both my C and assembler programming.
In the end it's horses for courses. Use the appropriate language for the problem at hand.
You know assembly and you seem to know C#. It's never a bad thing to learn yet another language, but I would only recommend learning C if you are going to need it in the near future. I think you would broaden your knowledge more by learning a dynamic language like Ruby or a functional language like Common Lisp.
No one has mentioned....
Writing C is quicker to develop....
that C is a route to writing assembler more quickly. When I wrote computer games, we wrote everything in C first, then re-wrote the parts that took all the time: the old 80-20 rule, where 80% of the time is spent in 20% of the code.
To do this we compiled the code we wished to re-write using the dump-to-assembler flag. Then we took the C-generated assembler file and used that as the basis for writing more optimised assembler code. This was far quicker than starting from scratch.
Today this approach is far harder, as compilers are far better and it is much harder for humans to improve on the compiler's code; since processors got so complicated, fast code has become about keeping the caches and pipelines full.
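For anyone wanting to try the workflow described above, the modern equivalent of the dump-to-assembler flag in GCC and Clang is -S; the function below is just a made-up example of the kind of hot spot you might inspect:

```c
/* hotspot.c -- compile with:  gcc -O2 -S hotspot.c
 * This writes hotspot.s, the compiler's assembly output, which can then
 * serve as the starting point for hand optimisation. */
int dot(const int *a, const int *b, int n)
{
    int sum = 0;
    for (int i = 0; i < n; i++)
        sum += a[i] * b[i];
    return sum;
}
```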
Writing C is portable between processors....
When writing our games we were able to port 80% of our code between machines with a recompile. More if we used the C versions of the 20% code which we would reimplement in assembler for speed.
Tony
I think the most important reason why you should learn any programming language is so that you can put it to some use.
If you've learnt Assembly to do something, and you feel you can do something else better in C, then go ahead by all means.
But if you find that you've got nothing to do in C, then professionally there's no point in learning it.
If, however, you want to do it as a hobby or a personal endeavor, then it's your time; do anything you want.
C is portable (if you write it carefully), that is a good reason for me.
Learning a new language is always a fun thing to do, especially if it's significantly different, paradigm-wise, from what you already know. So I'd say go for it.
I find it very interesting that C is still one of the most sought-after languages on major search engines and book sites.
Maybe not...but it won't hurt to learn it :) I personally learned x86 assembly before C and my assembly knowledge made it easier for me to grasp C pointers.
It depends. If you want to work with C++, you surely should also learn C. If you want to code in Python or Perl you don't really need it, as you already have an understanding of the internals from Assembler.
One thing: did you work in Assembler with pointers and the heap? Understanding pointers and memory management is very important for every higher-level language. If you didn't get the idea of pointers and the heap right, you should give C a try.
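As a refresher, here is roughly what "pointers and the heap" look like in C; a minimal sketch using only the standard library:

```c
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int n = 5;
    int *buf = malloc(n * sizeof *buf);   /* heap allocation */
    if (buf == NULL)
        return 1;                         /* allocation can fail */

    for (int i = 0; i < n; i++)
        buf[i] = i * i;                   /* buf[i] is *(buf + i) */

    int *p = buf + 2;                     /* pointer arithmetic */
    printf("%d\n", *p);                   /* prints 4 */

    free(buf);                            /* the caller owns the memory */
    return 0;
}
```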
I look at it pragmatically - I wouldn't bother unless you feel like you have jobs where performance is more important than programmer productivity. After 12 years of programming, I've never come across a job that should have been written in C, instead of a garbage collected language. But, your situation may vary.
If all you knew was Java, then I would say yeah, it would be great.
I think it depends - do you (or might you in the future) have to deal with a codebase that includes C? There's an awful lot of it out there, so I'm actually surprised that you haven't already had a need to do something with C (at least reading it) given the assembly, C# and Java experience you cite.
Also, given that you know the above set of languages and the concepts that go along with them, I'd guess that learning C would be a cakewalk for you.
Yes, it is. C is like the core of all programming languages; almost every language is built based on it, and since you know C#, it won't take you much to learn C.
If you have done a lot of assembly work, then I guess you may someday, if not already, work with C. I don't think there is a programming job that wants you to know only assembly; C is needed. Even most low-level software like operating systems uses C, and Windows uses C++ as well, so in my opinion you should not even think twice about it. C is fundamental knowledge; even most web developers know C.
Amazing that in a world of new scripting languages every day there are still people that manage to only know assembler.
I myself, after writing Perl or JavaScript for longer periods, always find lower-level languages like C or C++ kind of lacking; for instance, where in Perl I write foreach (@array), in C/C++ I have to fiddle with for loops and indexes and/or iterators.
So, yes, I can only imagine how much you will get from the abstractions C will provide for you.
Additionally, widening your perspective is always a Good Thing.
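To illustrate the contrast drawn above, here is what Perl's one-line foreach becomes in C; just a small sketch:

```c
/* What Perl's "foreach (@array)" becomes in C: an explicit index and an
 * explicit element count, since C arrays don't know their own length. */
#include <stdio.h>

int main(void)
{
    int values[] = { 3, 1, 4, 1, 5 };
    size_t n = sizeof values / sizeof values[0];

    for (size_t i = 0; i < n; i++)
        printf("%d\n", values[i]);

    return 0;
}
```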
So C obviously has a pretty dominant stronghold in low-level programming... but is anything coming out that challenges or wants to replace it?
Python/C#/etc. all seem to be aimed at the very high level, but when it comes down to the nitty-gritty low-level stuff, C seems to be king, and I haven't seen much that tries to replace it.
Is there anything out there, or does learning C for low level stuff seem to be the standard?
If you mean systems level, then perhaps the D language.
Whatever happened to Google's Go?
Well to be honest it depends on your need to be "low level"/"system level" and what the system is.
As Neera rightly points out, there is an increasing trend towards managed languages.
So, if you're writing application code, unless you're actually writing the algorithms and optimisations, the idea is that you use the managed code/higher level abstractions. The need to do low level stuff all the time is, on common platforms, vastly reduced. Anywhere you have access to an API that is anywhere near good, you're probably going to have nicer abstraction layers around.
However, if you're implementing on a new architecture, you can either use assembly to produce a compiler for that platform, or write a compiler that outputs machine code for that platform from another platform (cross compilation). Then you need to compile the compiler for the new platform itself.
As you can imagine, C++ is harder to deal with than C in this regard. Even C is actually quite an effort to do well. I've heard people say they like stack-based languages like FORTH because for basic work they can get it up and running with very little assembly (compared to a C compiler or a full-blown cross-compilation effort).
Edit (because I like it): Here's a link to the JonesForth git repository. Take a look. JonesForth is an implementation of Forth in i386 assembly, complete with code comments walking you through the whole process.
LLVM
C for low level stuff is standard. C works and it's known. C is fast because it is low level and makes the programmer do lots of things that Python and C# do for you. You could write another language aimed to replace C, but I don't think it would get you anywhere except a slightly different syntax (if you wanted to keep the speed of C).
Why is C so fast? Because it's shiny assembler. For the things you need to do even faster, you use YASM or inline assembler.
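As a small illustration of inline assembler, here is a sketch using GCC's extended asm syntax (x86-64 assumed) to read the CPU's time-stamp counter:

```c
#include <stdio.h>
#include <stdint.h>

static uint64_t rdtsc(void)
{
    uint32_t lo, hi;
    /* RDTSC puts the low 32 bits of the counter in EAX, the high in EDX. */
    __asm__ volatile ("rdtsc" : "=a"(lo), "=d"(hi));
    return ((uint64_t)hi << 32) | lo;
}

int main(void)
{
    uint64_t start = rdtsc();
    /* ... work to be timed would go here ... */
    uint64_t end = rdtsc();
    printf("elapsed cycles: %llu\n", (unsigned long long)(end - start));
    return 0;
}
```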
There's actually quite a few things that can be used for low level programming. Here's some used in the past with advantages over C.
Pascal variants (used in GEMSOS)
Oberon (used in the Oberon System and A2 Bluebottle)
Ada (used in safety critical projects and at least three OS's on limited hardware)
PL/I (Multics)
Modula (used in CVSup and some academic projects for correct system software)
Smalltalk and Haskell were used for prototype OS's or OS replacement layers.
Cyclone, Popcorn, C0, and Typed Assembly Language do better than C while keeping much of it.
Additionally, languages with a runtime can be used if the lowest-level parts are implemented by another language. Microsoft's Verve and the JX Operating System are examples. For an old-school one, look up the Genera LISP Machine and its "advantages." You still can't do much of that in modern systems development with C/C++ toolchains. ;)
So, even if C isn't totally replaceable, it's mostly replaceable in most situations without much performance loss. Have fun with these.
The recent trend is moving towards object-oriented and managed languages. For example, Symbian as an OS is entirely written in C++, and Microsoft Research has come up with the Singularity OS, which is a managed programming model. The idea is that managed languages protect users from easy-to-make mistakes in C, like resource leaks and pointer corruption, by abstracting away these ideas. The object-oriented paradigm also helps in writing easy-to-maintain code. For now C still rules the embedded world; however, we can see that changing in the coming decade, with more and more of the embedded world embracing C++ as the language of choice.
I don't think so.
I like to use my old assembly routines, but C is safe.
I don't think C is low-level enough. I would suggest assembly language; as far as I know, it's the lowest level a programmer can go. But you still have to deal with the assembler, linker, and loader, and there are still many details related to the target platform.
There are platform-specific low-level languages, such as assembly languages and machine code. Compared with these, C is rather a high-level language.
What do you exactly mean by low level?
C is also used for high level stuff like user interfaces (the whole GNOME Desktop, and its library GTK are written in C).
I'd put C in the low level category because it lets you play with the actual machine (e.g., raw memory addresses, just to cite something), adding only a really tiny abstraction layer.
Also other programming languages are offering a clean vision of the underlying machine:
Many are derived from C and are compatible with it (C++, Objective-C). These supply some tools to ease your life by abstracting things away. They could replace C, but if you used these languages you'd lose compatibility: Objective-C and C++ interfaces cannot be used from C.
Others belong to completely different families, and these, other than the above issue, cannot even use C stuff directly.
Thus, in my opinion, the main reason why C isn't dropped is commercial (it would cost too much to rewrite everything so that it is compatible with other languages), pretty much the same reason why COBOL still exists.
There are other reasons, like the fact that C is bare-bone, simple and fast to parse and compile and stuff, but in my opinion these are secondary.
Some big companies who can afford to rewrite anything are, however, trying to kick C off (Apple is extensively using Objective-C, for example, while others are using C++).
I think that in the future C will continue to exist, since there is no effort to choose a specific standard language to be used everywhere in place of C (if you write C code it'll work with C, C++, and Objective-C systems, while the opposite is not true), and since there's too vast a code base of C out there.
I'm a web developer mostly working in Ruby and C#.
I want to learn a low-level language so I don't look like an ass in front of my (computer science expert) boss.
I've heard a lot of purist buzz about how assembly language is the only way to learn how computers actually work, but on the other hand C would probably be more useful as a language rather than just for theory.
So my question is..
Would Learning C teach me enough computer science theory / low level programming to not look like a common dandy (complete tool)?
Thanks!
Daniel
Thanks guys!
Some really great answers,
I think I'll learn C just to get a grasp of memory management, but I think you're right and I'll be better off studying more of the languages I use!
First learn the actual theory. Learning C's syntax means nothing if you can't do anything meaningful with it. After you have a solid grasp of how algorithms and data structures work, this knowledge will be applicable in most languages you'll probably use.
If you're planning to work in C# and Ruby, I don't see the point in learning assembler just for the sake of doing so.
You'll be fine learning just C since it's C.
You can do "real programming" in any language, as long as you do it right.
Would learning C teach me enough computer science theory / low level programming to not look like a common dandy (complete tool)?
You're using C# which can perform unmanaged pointer manipulation, not too far off what you can achieve in C++ land. For instance, games can be successfully programmed in C#, although most games will be C++.
Assembler is basically moving bits of memory around and sometimes doing a bad job of it too if you don't know what you're doing. C++/C compilers create quite good assembly code underneath, so why would you do it yourself unless you had to write low-level driver software. The only bit of assembler in our 2,000,000 lines of mixed C++/C# code is to detect VMWare.
I don't think in this modern age and given your description of your job there is much call for you to know about how many registers you can use on your processor. By all means learn C++, but as mentioned before, learning syntax is not the same as learning how to apply it.
At a minimum, learn and understand patterns (Gang of Four), OO design, and the implications and benefits of different testing methodologies, as well as the impact of all this on business costs.
Learning C will get you all the information you need and be able to accomplish things in a reasonable time.
I suggest you do a few meaningful projects in C, then one or two in an Assembler dialect, maybe for a Microcontroller or specializing in SSE/MMX.
In the end it all comes down to opcodes, registers, and addressing modes, of which C teaches you absolutely nothing. However, assemblers are tightly coupled to their platforms; learning how to write assembler on x86 won't help you much when you work on a Sparc or PA-RISC box.
C really isn't going to teach you much that other languages won't in terms of theory. The thing about C is that it doesn't provide many tools or abstractions beyond naked pointers and byte streams. You have to roll all your own containers (lists, stacks, queues, etc.), and if you want them to be generic, you have to figure out how to separate the code that needs to be type aware (assignments and comparisons) from the general algorithm. You have to manage your own memory. Text processing is a pain in the ass (C doesn't even provide a dedicated string type).
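The standard library's qsort() is a concrete example of that separation: the generic algorithm works on untyped bytes, and the caller supplies the type-aware comparison through a function pointer. A minimal sketch:

```c
#include <stdio.h>
#include <stdlib.h>

/* The type-aware part: compare two ints handed over as untyped pointers. */
static int cmp_int(const void *a, const void *b)
{
    int x = *(const int *)a;
    int y = *(const int *)b;
    return (x > y) - (x < y);   /* avoids the overflow of x - y */
}

int main(void)
{
    int v[] = { 42, 7, 19, 3 };
    qsort(v, 4, sizeof v[0], cmp_int);   /* generic sort, typed comparator */
    for (int i = 0; i < 4; i++)
        printf("%d ", v[i]);             /* 3 7 19 42 */
    putchar('\n');
    return 0;
}
```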
Is that something that would be useful to know for your day-to-day work, or to impress your boss? Maybe. I'm not sure how you would apply that knowledge in your C# or Ruby code, which operates in a virtual machine (as opposed to native binaries generated from C).
If you want to impress your boss with computer science knowledge, read up on Turing machines and Markov algorithms.
I don't think knowledge of Assembler would make you a better Ruby / C# programmer.
You can go as low level as writing to a disk using a magnetized needle, but if your programs aren't working or are insecure, you wouldn't have gained anything by it.
Learning the syntax of a "new" language won't assist you in gaining more knowledge of the depth of programming.
Real programming is not about a particular low level programming languge. To understand how a computer works it would be not bad to know some CPU instructions and how the hardware processes them.
To become a good programmer it is absolutely not necessary to know how the hardware works, except you want to program hardware.
I wouldn't worry about looking like a fool in front of that boss if that's the criteria your boss has for not being a fool.
C and Assembler aren't the best languages if you want to learn computer science. As you know Ruby and C#, I guess you know some object orientation.
"Real programming" works the same in every language, and that's what you should really worry about. That being said, working up something in C is a very useful exercise.
Learning C won't necessarily teach you too much about how computers work, but it is a good access door to that, considering that it is still the language of choice for system programming.
Learning ASM is of course useful as well, but it's probably uncalled for if you want to apply it to your job. It might teach you a few useful concepts though, or at least help you get a better understanding of how CLR works (or in general, how bytecode compilation and bytecode-compiled code do their stuff). Maybe you could try learning ASM for an older/simpler platform; there's still a heavy Atari 2600 scene around, and due to the platform's inherent limitations, some of the hacks they do to squeeze some extra functions in a game are quite awesome. The modern x86_64 architecture is pretty complex and somewhat hairy, although that's more of a personal opinion than an actual fact. Learning to use a microcontroller might also get the job done, but be warned that many (most?) use a Harvard architecture (i.e. separate program and data memory) which is different from that of a typical general-purpose CPU.
Ask your boss first, then do exactly what he tells you; finally, ask again for feedback and review of the exercises you do. Act accordingly. Repeat until he says "you now know more than me."
This is the most efficient way of exploiting the expertise of a real expert you are lucky enough to have, and it will leave an extremely positive impression forever.
It will be about as difficult as learning C or assembler alone in your spare time.
I want to learn a low-level language so I don't look like an ass in front of my (computer science expert) boss.
This is like so true in many aspects; I learned C++ and Haskell for similar reasons. I have no regrets, since they taught me many things, but I do understand the peer pressure.
This is a six-year-old question, and it is interesting to see that no one has mentioned a good textbook on programming that covers the fundamentals of computer science.
If you want to learn "real programming", just try to tackle
"Structure and Interpretation of Computer Programs"
by MIT Press in your free time. It should cover most if not all of your curiosity with respect to programming whatever your level is.
There is also "The Art of Computer Programming" by Donald Knuth. Knuth is especially well versed in machine languages and other low level stuff.
Beware, though: both of these works are very dense.
Here are my two cents on how to approach them. First of all, they are not "tutorials" that sugar-coat your way towards a skill; they introduce a scientific discipline in an appropriate manner. So the mindset of "What am I going to do with this? How can I use this?" should be replaced by "How does this work? What does this do? Why does it do it this way?".
Try to do the exercises included in the books, but don't push yourself too hard. If you can't do one, it is okay; pass on to the next.
However, if you are not doing any of the exercises given in a section, you should take your time with the chapter and revise it so that you can at least do a couple of the exercises included for the section.
Just keep this up, and you will catch up with your boss/peers in no time. Some chapters/exercises might even become a subject of conversation if you feel like it. He might even like the fact that you are acquiring new skills, or not; don't take my word for it, I do programming so as not to deal with people.
Well, "real programming" doesn't refer to a specific language, or to low-level languages. In my mind, "real programming" is the process of analyzing a task, deciding the best way to solve it, putting that method into a set of simple steps, and then revising and editing as necessary until you have achieved your goal. From that standpoint, the language doesn't really matter; it's all in the thought process. Furthermore, there is nothing wrong with high-level languages; I use Python all the time and love it. However, learning low-level languages can really help develop your understanding of programs and computers in general. I would suggest C, because it is where C# and C++ stem from, and it would provide a basis for understanding both those languages.
I came across this: Writing a compiler using Turbo Pascal
I am curious if there are any tutorials or references explaining how to go about creating a simple C compiler. I mean, it is enough if it gets me to the level of making it understand arithmetic operations. I became really curious after reading this article by Ken Thompson. The idea of writing something that understands itself seems exciting.
Why did I put up this question instead of asking Google? I tried Google, and the Pascal one was the first link. The rest did not seem relevant, and added to that... I am not a CS major (so I still need to learn what all those tools like yacc do) and I want to learn this by doing, and am hoping people with more experience are always better at these things than Google. I want to read some article written in the same spirit as the one I listed above, but which highlights at least the bootstrapping phases of building a simple C compiler.
Also, I don't know the best way to learn. Do I start off building a C compiler in C or in some other language? Do I write a compiler for C or for some other language? I feel questions like this are better answered once I have some direction to explore. Any suggestions?
Any suggestions?
A compiler consists of three pieces:
A parser
An abstract syntax tree (AST)
An assembly code generator
There are lots of nice parser generators that start with language grammars. Maybe ANTLR would be a good place for you to start. If you want to stick to C roots, try lex/yacc or bison.
There are grammars for C, but I think C in its entirety is complex. You'd do well to start off with a subset of the language and work your way up.
Once you have an AST, you use it to generate the machine code that you'll run.
It's doable, but not trivial.
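To give a feel for the parser piece, here is a minimal recursive-descent sketch for expressions like 1+2*3; it handles single digits and two operators only, and evaluates directly instead of building an AST or generating code:

```c
#include <stdio.h>
#include <ctype.h>

static const char *p;          /* cursor into the input */

static int factor(void)        /* factor := digit */
{
    return isdigit((unsigned char)*p) ? *p++ - '0' : 0;
}

static int term(void)          /* term := factor ('*' factor)* */
{
    int v = factor();
    while (*p == '*') { p++; v *= factor(); }
    return v;
}

static int expr(void)          /* expr := term ('+' term)* */
{
    int v = term();
    while (*p == '+') { p++; v += term(); }
    return v;
}

int main(void)
{
    p = "1+2*3";
    printf("%d\n", expr());    /* prints 7: '*' binds tighter than '+' */
    return 0;
}
```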
I'd also check Amazon for books about writing compilers. The Dragon Book is the classic, but there are more modern ones available.
UPDATE: There have been similar questions on Stack overflow, like this one. Check out those resources as well.
I recommend this tutorial:
LLVM tutorial
It is a small example of how to implement a "small language" compiler. The source code is very small and is explained step by step.
There is also a C front-end library for LLVM (the Low Level Virtual Machine, which represents the internal structure of a program):
Clang
For what it's worth, the Tiny C Compiler is a pretty full-featured C compiler in a relatively small source package. You might benefit from studying that source, as it's probably significantly easier to understand than trying to comprehend all of GCC's source base, for instance.
This is my opinion (and conjecture): it will be hard to write a compiler without understanding the data structures normally covered in undergraduate (post-secondary) Computer Science classes. This doesn't mean you cannot, but you will need to know essential data structures such as linked lists and trees.
Rather than writing a full or standards-compliant C language compiler (at least at the start), I would suggest limiting yourself to a basic subset of the language, such as common operators, integer-only support, and basic functions and pointers. One classic example of this was Ron Cain's Small-C, made popular by a series of articles written in Dr. Dobb's Journal in, I believe, the 1980s. They published a CD with James Hendrix's out-of-print book, A Small-C Compiler.
What I would suggest is following Crenshaw's tutorial, but writing it for a C-like language compiler and whatever CPU target (Crenshaw targets the Motorola 68000) you wish. To do this, you will need to know the basic assembly of whichever target you want to run the compiled programs on. This could include an emulator for the 68000 or MIPS, which arguably have nicer assembly instruction sets than the venerable CISC instruction set of the Intel x86 (16/32-bit).
There are many potential books that can be used as starting points for learning compiler/translator theory (and practice). Read the comp.compilers FAQ and reviews at various online booksellers. Most introductory books are written as textbooks for sophomore- to senior-level undergraduate Computer Science classes, so they can be slow reading without a CS background. One older book that might be more introductory, but easier to read than "The Dragon Book", is Introduction to Compiler Construction by Thomas Parsons. It is older, so you should be able to find a used copy from your choice of online booksellers at a reasonable price.
So I'd say, try starting with Jack Crenshaw's Let's Build a Compiler tutorial, write your own, following his examples as a guide, and build the basics of a simple compiler. Once you have that working, you can better decide where you wish to take it from that point.
Added:
In regard to the bootstrapping process: since there are existing C compilers freely available, you do not need to worry about bootstrapping. Write your compiler with separate, existing tools (GCC, Visual C++ Express, MinGW / DJGPP, TCC), and you can worry about self-compiling your project at a much later stage. I was surprised by this part of the question until I realized you were brought to the idea of writing your own compiler by reading Ken Thompson's ACM Turing Award speech, Reflections on Trusting Trust, which does go into the compiler bootstrapping process. It's a moderately advanced topic, and also simply a lot of hassle. I found even bootstrapping the GCC C compiler under older Unix systems (Digital OSF/1 on the 64-bit Alpha) that included a C compiler to be a slow, time-consuming, error-prone process.
The other sort-of question was what a compiler tool like yacc actually does. Yacc (Yet Another Compiler Compiler; Bison is the GNU version) is a tool designed to make writing a compiler (or translator) parser easier. Based on the formal grammar for your target language that you feed it, it generates a parser, which is one portion of a compiler's overall design. Next is lex (or flex from GNU), which is used to generate a lexical analyzer or scanner, often used in combination with the yacc-generated parser to form the skeleton of the front-end of a compiler. These tools make writing a front end arguably easier than writing a lexical analyzer and parser yourself. Crenshaw's tutorial does not use them, and you don't need to either; many compiler writers don't always use them. Of course, Crenshaw admits the tutorial's parser is quite basic.
Crenshaw's tutorial also skips generating an AST (abstract syntax tree), which simplifies but also limits the tutorial compiler. It lacks most if not all optimization, and is very tied to the specific programming language and the particular assembly language emitted by the "back-end" of the compiler. Normally the AST is a middle piece where some optimization can be performed, and serves to de-couple the compiler front-end and back-end in design. For a beginner without a Computer Science background, I'd suggest not worrying about not having an AST for your first compiler (or at least the first version of it). I think keeping it small and simple will help you finish writing a compiler, in its first version, and you can decide from there how you want to proceed then.
You might be interested in the book/course The Elements of Computing Systems:Building a Modern Computer from First Principles.
Note that this isn't about building a "pc" from stuff you bought off newegg. It begins with a description of Boolean logic fundamentals, and builds a virtual computer from the lowest levels of abstraction to progressively higher levels of abstraction. The course materials are all online, and the book itself is fairly inexpensive from Amazon.
In the course, in addition to "building the hardware", you'll also implement an assembler, virtual machine, compiler, and rudimentary OS, in a step-wise fashion. I think this would give you enough of a background to delve deeper into the subject area with some of the more commonly recommended resources listed in the other answers.
In The Unix Programming Environment, Kernighan and Pike walk through 5 iterations of making a calculator working from simple C based lexical analysis and immediate execution to yacc/lex parsing and code generation for an abstract machine. Because they write so wonderfully I can't suggest smoother introduction. It is certainly smaller than C, but that is likely to your advantage.
How do I [start writing] a simple C compiler?
There's nothing simple about compiling C. The best simple C compiler is lcc by Chris Fraser and David Hanson. They spent 10 years working on the design to make it as simple as they possibly could, while still generating reasonably good code. If you have access to a university library, you should be able to get their book.
Do I start off building a C compiler in C or some other language?
Some other language. One time I got to ask Hanson what lessons he and Fraser had learned by spending 10 years on the lcc project. The main thing Hanson said was
C is a lousy language to write a compiler in.
You're better off using Haskell or some dialect of ML. Both languages offer functions over algebraic data types, which is a perfect match to the problems faced by the compiler writer. If you still want to pursue C, you could start with George Necula's CIL, which is a big chunk of a C compiler written in ML.
I want to read some article written in the same spirit as the one I listed above but that which highlights at least the bootstrapping phases...
You won't find another article like Ken's. But Andrew Appel has written a nice article called "Axiomatic Bootstrapping: A Guide for Compiler Hackers". I couldn't find a free version, but many people have access to the ACM Digital Library.
Any suggestions?
If you want to write a compiler,
Use Haskell or ML as your implementation language.
For your first compiler, pick a very simple language, like Oberon or PL/0 from Niklaus Wirth's book Algorithms + Data Structures = Programs. Wirth is famous for designing languages that are easy to compile.
You can write a C compiler for your second compiler.
A compiler is a complex subject matter that covers aspects of
Input processing involving Lexing, Parsing
Building a symbol store of every variable used such as an Abstract Syntax Tree (AST)
From the AST, traversing it and emitting a machine-code binary based on the syntax
This is by no means exhaustive; it is an abstract bird's-eye view from the top of the mountain. It boils down to getting the syntax notation correct and ensuring that malformed input does not throw the compiler off. In fact, good input processing should never fall to its knees, no matter how malformed, terrible, or abusive the input thrown at it. You also have to decide and know what the output is going to be: if it is machine code, that implies you may have to get to know the processor instructions intimately, including memory addressing for variables and so on...
Here are some links for you to get started:
There was a port of Jack Crenshaw's code for C... (I recall downloading it months ago...)
Here's a link to a similar question here on SO.
Also, here's another small compiler tutorial for Basic to x86 assembler compiler.
Tiny C Compiler
Hendrix's Small C Compiler found here.
It might be worthwhile to learn about functional programming, too. Functional languages are well-suited to writing a compiler both in and for. My school's intro compilers class contained an intro to functional languages and the assignments were all in OCaml.
Funny you should ask this today, since just a couple days ago I wrote a lambda calculus interpreter. Lambda calculus is the granddaddy of all functional languages. It's just 200 lines long (in C++, incl. error reporting, some pretty printing, some unicode) and has a two-phase structure, with an intermediate format that could be used to generate code.
Not only is starting small and building up the most practical approach to compilers, it also encourages good, modular, organizational practice.
A compiler is a very large project, although I suppose it wouldn't hurt to try.
I know of at least one C compiler written in Pascal, so it's not the most insane thing you could do. I personally would pick a more modern language in which to implement my C compiler project, both for the simplicity (it's easy to d/l packages for Python, Ruby, C, C++ or Java) and because it will look better on your resume.
In order to do a compiler as a beginner project, though, you will need to drink all of the Agile kool-aid.
Always have something running, even if it doesn't do much of anything. Add things to your compiler only in small steps. ("Frequent releases".) Pick a viciously tiny subset of the language and implement that first. (Support only i = 0; at first and expand things from there.)
If you want a mind-blowing experience that teaches you how to write compilers that compile themselves, you need to read this paper from 1964.
META II a syntax-oriented compiler writing language by Val Schorre.
In 10 pages, it tells you how to write compilers, how to write meta compilers, provides a virtual metacompiler instruction set, and a sample compiler built with the metacompiler.
I learned how to write compilers from this paper back in the late 60s, and used the ideas to construct C-like languages for several minicomputers and microprocessors.
If the paper is too much by itself (it isn't!) there's an online tutorial which will walk you through the whole thing.
And if getting the paper from the original link is awkward because you are not an ACM member, you'll find that the tutorial contains all the details anyway. (IMHO, for the price, the paper itself is waaaaay worth it).
10 pages!
I would not recommend starting with C as the language to implement, nor with any of the compiler-generator or parser-generator tools. C is a very tricky language, and it's probably a better idea to just make up a language of your own. It can be a little C-like (e.g. use curly brackets to indicate the function body, use the same type names, so you don't have to remember what you called everything).
The tools for making compilers and parsers are great, but have the problem of really being a shorthand notation. If you don't know how to create a compiler in longhand, the shorthand will seem cryptic, needlessly restrictive etc. So write your own simple compiler first, then continue on from there. I also recommend you don't start generating actual machine code unless you eat and breathe assembler. Create your own bytecode interpreter with a VM.
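A bytecode interpreter of the kind suggested here can start very small; the following sketch is a made-up stack machine with four opcodes:

```c
#include <stdio.h>

enum { OP_PUSH, OP_ADD, OP_PRINT, OP_HALT };

static void run(const int *code)
{
    int stack[64], sp = 0;
    for (int pc = 0; ; ) {
        switch (code[pc++]) {
        case OP_PUSH:  stack[sp++] = code[pc++];          break;
        case OP_ADD:   sp--; stack[sp - 1] += stack[sp];  break;
        case OP_PRINT: printf("%d\n", stack[--sp]);       break;
        case OP_HALT:  return;
        }
    }
}

int main(void)
{
    /* Bytecode for: print(2 + 3) */
    int program[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_PRINT, OP_HALT };
    run(program);
    return 0;
}
```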
As to what language you should use to create your first compiler: It doesn't really matter, as long as the language is fairly complete. You will be reading input text, building data structures from them and writing out binary data. So if a language makes those things easier in any way, that's a point in favor of it. Pick a language you know well, so you can focus on creating the compiler, not learning the language. I usually use an OO language, which makes the syntax tree easier to write, a functional language would probably also work if you are familiar with that.
I've blogged a lot about programming languages, so you might find some useful postings here: http://orangejuiceliberationfront.com/category/language-design/
In particular, http://orangejuiceliberationfront.com/how-to-write-a-compiler/ is a starter on the particulars of parsing common constructs and generating something useful from that, as well as http://orangejuiceliberationfront.com/generating-machine-code-at-runtime/ which talks about actually spitting out Intel instructions that do something.
Oh, regarding bootstrapping of a compiler: You probably won't be able to do that right from the start. There is a fair amount of work involved in creating a compiler. So not only would writing a bootstrapping compiler involve writing the compiler (in some other language), once you have it, you would then have to write a second version of the compiler using itself. That's twice the work, plus the debugging needed in the existing and the bootstrapped new compiler until it all works. That said, once you have a working compiler, it is a good way to test its completeness. OK, maybe not twice the work, but more work. I'd go for the easy successes first, then move on from there.
In any event, have fun!
I'm in the process of teaching myself Objective-C via Stephen Kochan's "Programming in Objective-C 2.0". It's a great book, and I'm currently reading it for the second time; the first time I got the gist of it, and now it's really starting to sink in.
If you have a moderate (about) knowledge of Objective-C, how difficult would it be to learn C? I realize there is seemingly endless debate on which language to learn first. I decided to go with Objective-C because I was interested in Cocoa Mac Apps/iPhone apps.
Side note: for those familiar with the Chipmunk physics engine... the reason I may start pursuing C eventually is that it uses C. How much C would I need to know to use it adequately? I was going to use it along with Cocos2d, which uses Objective-C.
Given that C is a strict subset of Objective-C, if you truly fully know Objective-C already, you know C as well.
No. I learned C and I came from C#.
However, it is really hard to find updated tutorials/blogs; here are a few that I used:
Blogs (only one I could find that's updated):
Hard To C
Tutorials:
About C Programming Tutorials
C Programming
The C Tutorial
C Pointer Tutorial
C and ObjC have a lot of overlap, but their patterns are very different. Memory management in particular is radically different. Much of how you attack problems is very different. ObjC is all about relying on the framework and fitting into the framework and not getting in the way of the framework. In C, you're the bottom layer; libraries rely on you most of the time, not the other way around.
That said, if your goal is to write ObjC programs that incorporate C libraries, then learning ObjC first is definitely the right approach, and Kochan's book is a great start (followed by Cocoa Programming for Mac OS X by Hillegass). Using an engine like Chipmunk or cocos2d will take care of some of the harder details of C programming for you and definitely help ease you into learning your way around.
Basically, C is Objective-C without:
Anything involving message sending (all those [] brackets)
Anything that starts with an @ (the Objective-C guys chose to use that in front of everything they added, to make it clear what was an extension)
Additionally, there are a few things that are part of Objective-C but that you might never actually use if you learned Objective-C alone. These are actually useful to understand, since sometimes they are the best choice even in Objective-C code, and you might bump into them when you interface with other people's code. Things like:
Function declaration syntax (no methods)
Function pointers
structs
malloc()/free()
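A small sketch touching the items on that list (plain function syntax, a struct, malloc()/free(), and a function pointer); the names are made up for illustration:

```c
#include <stdio.h>
#include <stdlib.h>

typedef struct {
    double x, y;
} Point;

/* Plain function declaration syntax: no methods, no message sends. */
static double sum(const Point *p) { return p->x + p->y; }

int main(void)
{
    Point *p = malloc(sizeof *p);        /* manual allocation... */
    if (p == NULL)
        return 1;
    p->x = 1.5;
    p->y = 2.5;

    double (*f)(const Point *) = sum;    /* a function pointer */
    printf("%g\n", f(p));                /* prints 4 */

    free(p);                             /* ...and manual release */
    return 0;
}
```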
Overall, I would think it is probably easier to move from Objective-C to C than it is to start learning C from scratch. That is just a guess, though; I learned C first (around 15 years ago) and have been writing Objective-C code for close to a decade.
Well, I would suggest getting some experience with C before delving into Objective-C. Why? Because Objective-C insulates you from a lot of the more complicated and interesting programming bits, like very manual memory management, crazy pointers, &c. Because Objective-C is a strict superset of C, there may come a time when you need to use these concepts, but without any C experience, it can be pretty confusing. I made the mistake of not spending enough time on C before jumping to Objective-C, so when I started working on more complex applications, I needed to do a lot of reading on C.
I learned C before I learned Objective-C, but the two have their similarities and differences.
All the if statements and for and while loops are the same in C, but how you get things done is different. You're not really exposed to the whole pointer thing in Objective-C, but it is super important in C, and the only way to get it down is reading and practice. It may seem daunting, but once you get the hang of it, it's not that bad. Functions are a bit different from methods, syntactically and in how they're called, so you will have to learn that too, but if you get a good book (http://www.amazon.com/Programming-Language-Prentice-Hall-Software/dp/0131103628) it shouldn't be that hard. Just read up and practice!
I also highly recommend the Stanford iTunes U Programming Paradigms class; it helps a lot with learning about pointers once you have some basic knowledge. Being a young programmer, it definitely helped me get a good grasp of it all.
Pointers are what make C difficult, and it sounds as though they aren't a big part of programming Objective-C.
Chapter 13 "Underlying C Language Features" in Stephen Kochan's Obj-C book, appears to talk a lot about pointers, structures, functions, ect... with C. I never read it the first time around (it sorta suggested only read it if you need to) but after I read it the 2nd time I might go back and read Chapter 13, it should give me a good idea of C compared to Obj-C.
Joel and company expound on the virtues of learning C, and on how the best way to learn a language is to actually write programs that use it. To that effect, which types of applications are most suitable to the C programming language?
Edit:
I am looking for what C is good at. This likely does not coincide with the best way of learning C.
Code where you need absolute control over memory management. Code where you need to be utterly in control of speed versus memory trade-offs. Very low-level file manipulation (such as access to the raw devices).
Examples include OS kernel, and embedded apps.
In the late 1980s, I was head of the maintenance team on a C system that was more than a million lines of code. It did database access (Oracle), X Windows graphics, interprocess communications, all sorts of good stuff. It ran on VMS and several varieties of Unix. But if I were to recreate that system today, I sure wouldn't use C, I'd use Java. Others would probably use C#.
Low level functions such as OS kernel and drivers. For those, C is unbeatable.
You can use C to write anything. It is a very general purpose language. After doing so for a little while you will understand why there are other "higher level" languages available.
"Learn C", by all means, but don't don't stick with it for too long. Other languages are far more productive.
I think the gist of the people who say you need to learn C is so that you understand the relationship between high level code and the machine internals and what exaclty happens with bits, bytes, program execution, etc.
Learn that, and then move on.
Those 100 lines of Python code that were accounting for 80% of your execution time.
Small apps that don't have a UI, especially when you're trying to learn.
Edit: After thinking a little more on this, I'd add the following: if you already know a higher-level language and you're trying to learn more about C, a good route may be not to create a whole new C app, but instead to create a C DLL and add functions to it that you can call from the higher-level language. In this way you can replace simple functions that your high-level language already has, to ensure that your C function does what it should (this gives you pre-built testing), lets you code mostly in what you're familiar with, uses the language on a problem you're already familiar with, and teaches you about interop.
Anything where you think of using assembly.
Number crunching (for example, libraries to be used at a higher level from some other language like Python).
Embedded systems programming.
A lot of people are saying OS kernel and device drivers which are, of course, good applications for C. But C is also useful for writing any performance critical applications that need to use every bit of performance the hardware is capable of.
I'm thinking of applications like database management systems (MySQL, Oracle, SQL Server), web servers (Apache, IIS), or even web browsers (look at the way Chrome was written).
You can do so many optimizations in C that are just impossible in languages that run in virtual machines like Java or .NET. For instance, databases and servers support many simultaneous users and need to scale very well. A database may need to share data structures between multiple users (threads/processes), but do so in a way that efficiently uses CPU caches. In C, you can use an operating system call to determine the size of the cache, and then align a data structure appropriately to the cache line so that the line does not "ping-pong" between caches when multiple threads access adjacent but unrelated data (so-called "false sharing"). This is one example. There are many others.
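Here is one way the cache-line trick above can look in code; a sketch that assumes a 64-byte line and C11 alignas (a real program would query the line size at runtime, e.g. via sysconf(_SC_LEVEL1_DCACHE_LINESIZE) on Linux):

```c
#include <stdalign.h>
#include <stdio.h>

#define CACHE_LINE 64

struct counter {
    /* Pad each counter to its own cache line so that two threads
     * updating adjacent counters never share (and "ping-pong") a line. */
    alignas(CACHE_LINE) long value;
};

struct counter per_thread[4];   /* no two entries share a cache line */

int main(void)
{
    printf("sizeof(struct counter) = %zu\n", sizeof(struct counter));
    return 0;
}
```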
A bootloader. Some assembly is also required, which is actually very nice.
Where you feel the need for 100% control over your program.
This is often the case in lower layer OS stuff like device drivers,
or real embedded devices based on MCUs, etc. (all this and more has already been mentioned above).
But please note that C is a mature language that has been around for many years
and will be around for many more years;
it has many really good debugging tools and still a huge number of developers that use it.
(It has probably lost a lot to more trendy languages, but it is still huge.)
All its strengths and weaknesses are well known; the language will probably not change any more.
So there is not much room for surprises...
This also means that it would probably be a good choice if you have an application with a long expected life cycle.
/Johan
Anything where you need a minimum of "magic" and need the computer to do exactly what you tell it to, no more and no less. Anything where you don't trust the "magic" of garbage collection to handle your memory because it might not be as efficient as what you can hand-code. Anything where you don't trust the "magic" of built-in arrays, strings, etc. not to waste too much space. Anything where you want to be able to reason about exactly what ASM instructions the compiler will emit for a given piece of code.
In other words, not too much in the real world. Most things would benefit more from higher level abstraction than from this kind of control. However, OS code, device drivers, and a few things that have to be near optimal in both space and speed might make sense to write in C. Higher level languages can do pretty well competing with C on speed, but not necessarily on space.
Embedded stuff, where memory usage and CPU speed matter.
The interrupt handler part of an OS (and maybe two or three more functions in it).
Even if some of you will now start to bash heavily on me:
I don't think that any decent app should be written in C; it is way too error-prone.
(And yes, I do know what I am talking about, having written an awful lot of code in C myself: OSes, compilers, VMs, network protocols, RT-control stuff, etc.)
Using a high level language makes you so much more productive. Speed is typically gained by keeping the 10-90 rule in mind: 90% of CPU time is spent in 10% of your code (which you should optimize first).
Also, using the right algorithm might give more performance than optimizing bits in C. And doing things right in C is so much more trouble.
PS: I really do mean what I wrote in the second sentence above; you can write a complete high-performance OS in a high-level language like Lisp or Smalltalk, with only a few minor parts in C. Think how the 70's Lisp machines would fly on today's hardware...
Garbage collectors!
Also, simple programs whose primary job is to make operating-system calls. For example, I need to write a short C program called timeout that
Takes a command line as argument, with a number of seconds for that command to run
Forks two child processes, one to run the command and one to sleep for N seconds
When the first of the child processes exits, kills the other, then exits
The effect will be to run a command with a limit on wall-clock time.
Others on this forum and I have tried several different solutions using shells and/or Perl. All are convoluted and none quite do the right thing. In C the solution will be easy, because all the OS facilities are right where you can get at them.
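A minimal sketch of such a timeout program (POSIX assumed, error handling kept to the bare minimum):

```c
#include <stdio.h>
#include <stdlib.h>
#include <signal.h>
#include <unistd.h>
#include <sys/wait.h>

int main(int argc, char **argv)
{
    if (argc < 3) {
        fprintf(stderr, "usage: %s seconds command [args...]\n", argv[0]);
        return 2;
    }

    pid_t worker = fork();
    if (worker == 0) {                       /* child 1: run the command */
        execvp(argv[2], &argv[2]);
        perror("execvp");
        _exit(127);
    }

    pid_t timer = fork();
    if (timer == 0) {                        /* child 2: just sleep N seconds */
        sleep((unsigned)atoi(argv[1]));
        _exit(0);
    }

    pid_t first = wait(NULL);                /* whichever child exits first... */
    kill(first == worker ? timer : worker, SIGKILL);  /* ...kills the other */
    wait(NULL);                              /* reap the survivor */
    return 0;
}
```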
A few kinds that I can think of:
Systems programming that directly uses Unix/Linux or Win32 system calls
Performance-critical code that doesn't have much string manipulation in it (e.g., number crunching)
Embedded programming or other applications that are severely resource-constrained
Basically, C gives you portable, efficient access to the native capabilities of the machine; everything else is your responsibility. In particular, string manipulation in C is tedious, error-prone, and generally nasty; the most effective way to do extensive string operations with C may be to use it to implement a language that handles strings better...
Examples are: embedded apps, kernel code, drivers, raw sockets.
However if you find yourself more productive in C then go ahead and build whatever you wish. Use the language as a tool to get your problem solved.
A C compiler.
Research in maths and physics. There are probably two alternatives: C and C++, but such features of the latter as encapsulation and inheritance are not very useful there. One could prefer to use C++ "as a better C" or just stay with C.
Well, most people are suggesting systems-programming-related things like OS kernels, device drivers, etc. These are difficult and time-consuming. Maybe the most fun thing to do with C is console programming. Have you heard of the HAM SDK? It is a complete software development kit for the Nintendo GBA, and making games for it is fun. There is also the cc65 compiler, which supports NES programming (although not completely). You can also make good emulators. Trust me, C is pretty helpful. I was originally a Python fan and hated C because it was complex, but after you get used to it, you can do anything with C. Now I use CPython to embed Python in my C programs (if needed) and code mostly in C.
C is also great for portability: there is a C compiler for just about every OS and almost every console and mobile device. I've even seen one that supports some calculators!
Well, if you want to learn C and have some fun at the same time, might I suggest obtaining NXC and a Lego Mindstorms set? NXC is a C compiler for the Lego Mindstorms.
Another advantage of this approach is that you can compare the effort to program the Mindstorms "brick" with C and then try LEJOS and see how it goes with Java.
All great fun.
Implicit in your question is the assumption that a 'high-level' language like Python or Perl (or Java or ...) is fast enough, small enough, ... enough for most applications. This is of course true for most apps and some choice X of language. Given that, your language of choice almost certainly has a foreign function interface (FFI). Since you're looking to learn C, create a module in the FFI built in C.
For example, let's assume that your tool of choice is Python. Reimplement a subset of NumPy in C. Since C is a pretty fast language and has, in C99, a clear numeric library interface, you'll get the opportunity to experience the power of C in an appropriate setting.
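To make the FFI idea concrete, here is a sketch of a tiny CPython extension module in C; the module name fastmath and its one function are invented for the example, and build details (Python headers, a setup script) are left out:

```c
#define PY_SSIZE_T_CLEAN
#include <Python.h>

/* Sum a Python list of floats in C. */
static PyObject *fastmath_sum(PyObject *self, PyObject *args)
{
    PyObject *list;
    if (!PyArg_ParseTuple(args, "O", &list))
        return NULL;

    double total = 0.0;
    Py_ssize_t n = PyList_Size(list);
    for (Py_ssize_t i = 0; i < n; i++)
        total += PyFloat_AsDouble(PyList_GetItem(list, i));

    return PyFloat_FromDouble(total);
}

static PyMethodDef methods[] = {
    { "sum", fastmath_sum, METH_VARARGS, "Sum a list of floats." },
    { NULL, NULL, 0, NULL }
};

static struct PyModuleDef module = {
    PyModuleDef_HEAD_INIT, "fastmath", NULL, -1, methods
};

PyMODINIT_FUNC PyInit_fastmath(void)
{
    return PyModule_Create(&module);
}
```

Once built, Python code would simply call fastmath.sum([1.0, 2.0, 3.0]).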
ISAPI filters for Internet Information Server.
Before actually writing C code, I would suggest first reading good C code.
Choose a subject you want to concentrate on (basically any application can be written in C, but I assume a GUI application will not be your first choice) and find a few open source projects to look into.
Not every open source project is the best code to look at. I assume that after you select a subject there is room for another question: ask for the best open source project in the field.
Play with it, understand how it works, modify some functionality...
The best way to learn is to learn from somebody good.
Photoshop plugin filters. There's lots and lots of interesting graphical manipulation you can do with those, and you'll need pure C to do it in.
For instance, that's a gateway to fractal generation, Fourier transforms, fuzzy algorithms, etc. Really, anything that can manipulate image/color data and give interesting results.
Don't treat C as a beefed-up assembler. I've done some serious apps in it when it was the natural language (e.g., the target machine was a Solaris box with X11).
Write something with some meat on it. Write a client-server chess program, where the AI is on a server and the UI is displayed in X11; once you've done that you will really know C.
I wonder why nobody stated the obvious:
Web applications.
Any place where the underlying libraries are entirely in C is a good candidate for staying in C - openGL, Lua extensions, PHP extensions, old-school windows.h, etc.
I prefer C for anything like parsing or code generation: anything that doesn't need a lot of data structure (read: OOP). Its library footprint is very light, because class libraries are non-existent. I realize I'm unusual in this, but just as lots of things are "considered harmful", I try to have/use as little data structure as possible.
Following on from what someone else said: C seems a good language in which to implement the language in which you write the rest of your software.
And (mutatis mutandis) the virtual machine which runs the rest of your software.
I'd say that with the minuscule runtime environment and its self-contained nature, you might start by creating some CLI utilities such as grep or tail (or half the commands in Unix). Anything that uses only STDOUT, STDIN, and file manipulation is a good candidate.
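For instance, a stripped-down line-numbering filter (a toy "cat -n") fits that description; a minimal sketch using only stdin and stdout:

```c
#include <stdio.h>

int main(void)
{
    char line[4096];
    long n = 1;

    /* Read stdin line by line and echo each line with a line number.
     * (Lines longer than the buffer would be numbered more than once;
     * good enough for a learning sketch.) */
    while (fgets(line, sizeof line, stdin) != NULL)
        printf("%6ld  %s", n++, line);

    return 0;
}
```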
This isn't exactly answering your question because I wouldn't actually CHOOSE to use C in such an app, but I hope it's answering the question you meant to ask--"what would be a good type of app to use learn C on?"
C isn't actually that bad a language: it's pretty easy to understand your code at an assembly language level, which is quite useful, and the language constructs are few, leaving a very sparse language.
To answer your actual question, the very best use of C is the one for which it was created: porting itself (and UNIX) to other CPU architectures. This explains the lack of language features very well; it also explains the existence of pointers, which are really just a construct to make the compiler work less. Any code created with pointers could be created without them (just as well optimized), but it becomes much harder to create a compiler to do so.
Digital signal processing: all Pure Data extensions are written in C. This can be done in other languages as well, but it has had good success in C.