I often write code in MATLAB/Python to test whether my algorithm is feasible (and actually works). I then need to convert the entire code into C and sometimes into Fortran 90.
What would be a good way to manually convert a medium sized code from one language to another?
I have tried:
Converting the entire code from one language into the other and then testing it.
(Sometimes there are errors and bugs that just won't go away, and finding the source of the error becomes a problem.)
Going line by line and checking the outputs for consistency every few lines.
(Too time consuming.)
Using converters like f2c.
(In my experience, they are terrible. I link to a lot of libraries which have different function calls for C and Fortran.)
Also:
I am fairly conversant with the programming languages I deal with so I don't need manuals or reference guides for my work (i.e. I know the syntax).
I am not asking this question specifically about MATLAB and C but rather as a translation paradigm.
Regarding size, the programs are less than 100 lines long.
I don't want to call code written in one language from the other. Please don't suggest that.
Different languages call for different paradigms. You definitely don't write and design code the same way in, e.g., Matlab, Python, C#, or C++. Even object hierarchies will change a lot depending on the language.
That said, if your code consists of a few interconnected procedures, then you may get away with a direct line-by-line translation (every language allows you to write two or three interconnected functions while remaining idiomatic). But this is the case only for the simplest programs.
Prototyping in a high level language and then implementing the same idea in a robust and clean way in a "production" language is a very good practice, but it involves two very different activities:
Prototype in whatever language you want. Test, experiment, and convince yourself that the idea works. Pay attention to the big picture, don't focus on performance but on the high level ideas. Pay also attention to difficulties that you encounter when implementing, as you'll face them again in step 2.
Implement the idea from scratch in the production environment in language X. It will be quicker than if you had not done the prototyping stage, since most of the difficulties were already met in stage 1. Use idiomatic X, and focus on correctness. Pay attention to corner cases and general robustness, and once it works correctly, to performance. You'll notice that roughly half of your code consists of new things which did not appear in step 1 (e.g., error checking, corner case handling, input/output, unit testing, etc.).
You can see that a line-by-line translation is obviously not a good idea, since you are not translating into the same program.
Also, when not prototyping, I find myself throwing away the first version and making another one that I like better, i.e., I find myself prototyping! Implementing the same thing twice is not a waste of time; it is normal development flow.
You may want to consider using a higher level domain specific language with multiple backends (e.g., Matlab, C, Fortran), producing clean and idiomatic code for each target language, probably with some optimisations. If your problem domain is narrow and every piece of code is more or less typical, it should be fairly trivial to design and implement such a DSL.
Break the source down into pseudo-code with input/process/output and then write your new code base to fit that spec.
I am currently playing around with programming languages. I have spent some time writing parsers and interpreters in high level languages (most notably Haxe).
I have had some results, that I think are actually quite nice, but now I'd like to make them fast.
My idea was to translate the input language into C.
My C knowledge is limited to what you learn at university. Beyond some exercises, I have never written actual C programs. But I feel confident I can make it work.
Of course I could try to write a frontend for the LLVM or to generate MSIL or JVM bytecode. But I feel that's too much to learn right now, and I don't see much of a gain actually.
Also, C is perfectly human-readable, so if I screw up, it's much easier to understand why. And C is, after all, high level. I can really translate concepts from the input language without too much mind-bending. I should have something up and running in a reasonable amount of time and can then optimize it as I see fit.
So: Are there any downsides to using C? Can you recommend an alternative?
Thank you for your insight :)
Edit: Some Clarification
The reason why I want to go all the way down is, that I am writing a language with OOP support and I want to actually implement my method dispatching by hand, because I have something very specific in mind.
A primary area of use would be writing HTTP services, but I could imagine adding bindings to a GUI library (wxWidgets maybe) or whatever.
C is a good and quite popular choice for what you're trying to do.
Still, take a look at LLVM's intermediate representation (IR). It's pretty readable and I think it's cleaner and easier to generate and parse than C. LLVM comes with quite a big collection of tools to work with it. You can generate native code for a variety of platforms (as with C, but with slightly more control over the output) or for virtual machines. The possibility of JIT compilation is also a plus.
See The Architecture of Open Source Applications, Chapter 11 for introduction to LLVM approach and some snippets of IR.
What is your target environment? This might help us give you better answer.
C is actually a pretty good choice of target language for a small or experimental compiler -- it's widely available on many platforms, so your compiler becomes immediately useful in many environments. The main drawback is dealing with things that are not well supported in C, or are not well defined in the C spec. For example, if you want to do dynamic code generation (JIT compilation), C is problematic. Things like stack unwinding and reflection are tricky to do in C (though setjmp/longjmp and careful use of structs for which you generate layout descriptions can do a lot). Things like word sizes, big- or little-endian layout, and arithmetic precision vary between C compilers, so you have to be aware of that, but those are things you need to deal with if you want to support multiple target machines anyway.
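As a small, hedged illustration of the setjmp/longjmp point: a non-local exit (the core of exception-style unwinding) can be expressed in portable C roughly as below. The names are made up for the example; a real generated runtime would also have to worry about releasing resources skipped over by the jump.

#include <setjmp.h>
#include <stdio.h>

static jmp_buf handler;

static void may_fail(int x)
{
    if (x < 0)
        longjmp(handler, 1);          /* "throw": jump back to the matching setjmp */
    printf("ok: %d\n", x);
}

int main(void)
{
    if (setjmp(handler) == 0) {       /* "try": setjmp returns 0 on the first pass */
        may_fail(42);
        may_fail(-1);                 /* triggers the longjmp */
    } else {
        printf("caught failure\n");   /* "catch": control lands here after longjmp */
    }
    return 0;
}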
Other languages can be used as well -- the main advantage of C is its ubiquity.
You might consider C--, a C-like language intended to be a better target for code generation than C.
C is a good choice, IMHO. Unlike many languages, C is generally considered "elegant" in that you have only 32 keywords, and very basic constructs (sequence, selection, iteration), with a very simple-and-consistent collection of tokens and operators.
Because syntax is very consistent within C (brackets and braces, blocks and statements, use of expressions), you're not marching into an unbounded world of language expansion. C is a mature language, has weathered time nicely, and now-a-days is a "known quantity" (which is really hard to say about many other languages, even "mature" ones).
I know about the existence of questions such as this one and this one. Let me explain.
After reading Joel's article Back to Basics and seeing many similar questions on SO, I've begun to wonder what specific examples there are of situations where knowing stuff like C can make you a better high-level programmer.
What I want to know is whether there are many examples of this. Many times, the answer to this question is something like "Knowing C gives you a better feel for what's happening under the covers" or "You need a solid foundation for your program", and these answers don't have much meaning. I want to understand the different specific ways in which you will benefit from knowing low-level concepts.
Joel gave a couple of examples: Binary databases vs XML, and strings. But two examples don't really justify learning C and/or Assembly. So my question is this: What specific examples are there of knowing C making you a better high level programmer?
My experience with teaching students and working with people who only studied high-level languages is that they tend to think at a certain high level of abstraction, and they assume that "everything comes for free". They can become very competent programmers, but eventually they have to deal with some code that has performance issues and then it comes to bite them.
When you work a lot with C, you do think about memory allocation. You often think about memory layout (and cache locality if that's an issue). You understand how and why certain graphics operations just cost a lot. How efficient or inefficient certain socket behaviors are. How buffers work, etc. I feel that using the abstractions in a higher level language when you do know how it is implemented below the covers sometimes gives you "that extra secret sauce" when thinking about performance.
For example, Java has a garbage collector and you can't assign things to memory directly. And yet, you can make certain design choices (e.g., with custom data structures) that affect performance, for the same reasons this would be an issue in C.
Also, and more generally, I feel that it is important for a power programmer to not only know big-O notation (which most schools teach), but that in real-life applications the constant is also important (which schools try to ignore). My anecdotal experience is that people with skills in both language levels tend to have a better understanding of the constant, perhaps because of what I described above.
In addition, many higher-level systems that I have seen interface with lower-level libraries and infrastructure, for instance communications, database or graphics libraries, or drivers for certain devices. If you are a power programmer, you may eventually have to venture out there, and it helps to at least have an idea of what is going on.
Knowing low level stuff can help a lot.
To become a racing driver, you have to learn and understand the basic physics of how tyres grip the road. Anyone can learn to drive pretty fast, but you need a good understanding of the "low level" stuff (forces and friction, racing lines, fine throttle and brake control, etc) to get those last few percent of performance that will allow you to win the race.
For example, if you understand how the CPU architecture works in your computer, you can write code that works better with it (e.g. if you know you have a certain CPU cache size or a certain number of bytes in each CPU cache line, you can arrange your data structures and the way that you access them to make the best use of the cache - for example, processing many elements of an array in order is often faster than processing random elements, due to the CPU cache). If you have a multi-core computer, then understanding how low level techniques like threading work can gave huge benefits (just as not understanding the low level can lead to disaster in threading).
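To make the cache point concrete, here is a small sketch (not a benchmark) contrasting a sequential walk over an array with a large-stride walk that touches exactly the same elements; on typical hardware the first is noticeably faster purely because of cache behaviour. The array size and stride are arbitrary illustrative values.

#include <stddef.h>

#define N (1 << 20)          /* about a million ints */

long sum_sequential(const int *a)
{
    long s = 0;
    for (size_t i = 0; i < N; i++)                   /* walks memory in order: good locality */
        s += a[i];
    return s;
}

long sum_strided(const int *a)
{
    long s = 0;
    for (size_t start = 0; start < 4096; start++)    /* same total work, but each access     */
        for (size_t i = start; i < N; i += 4096)     /* jumps 16 KB ahead: poor locality     */
            s += a[i];
    return s;
}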
If you understand how Disk I/O and caching works, you can modify file operations to work well with it (e.g. if you read from one file and write to another, working on large batches of data in RAM can help reduce I/O contention between the reading and writing phases of your code, and vastly improve throughput)
If you understand how virtual functions work, you can design high-level code that uses virtual functions well. If used incorrectly they can severely hamper performance.
If you understand how drawing is handled, you can use clever tricks to improve drawing speed. E.g., you can draw a chessboard by alternately drawing 64 white and black squares. But it is often faster to draw 32 white squares and then 32 black ones (because you only have to change the drawing colour twice instead of around 64 times). But you can actually draw the whole board black, then XOR 4 stripes across the board and 4 stripes down the board in white, and this can be much faster still (2 colour changes, and only 9 rectangles to draw instead of 64). This chessboard trick teaches you a very important programming skill: lateral thinking. By designing your algorithm well, you can often make a big difference to how well your program operates.
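Here is a sketch of that chessboard idea, using a hypothetical drawing API (set_color, set_xor_mode, and fill_rect are stand-ins, not a real library); the counts in the comments are the point, not the API.

#define SQ 32                 /* size of one square in pixels */
#define BLACK 0x000000
#define WHITE 0xFFFFFF

void draw_board_naive(void)
{
    for (int row = 0; row < 8; row++)
        for (int col = 0; col < 8; col++) {
            set_color((row + col) % 2 ? BLACK : WHITE);   /* up to 64 colour changes */
            fill_rect(col * SQ, row * SQ, SQ, SQ);        /* 64 rectangles */
        }
}

void draw_board_xor(void)
{
    set_color(BLACK);
    fill_rect(0, 0, 8 * SQ, 8 * SQ);            /* 1 rectangle: whole board black */
    set_color(WHITE);
    set_xor_mode(1);                            /* crossings flip back to black */
    for (int i = 0; i < 4; i++) {
        fill_rect(2 * i * SQ, 0, SQ, 8 * SQ);   /* 4 vertical white stripes */
        fill_rect(0, 2 * i * SQ, 8 * SQ, SQ);   /* 4 horizontal white stripes */
    }
    set_xor_mode(0);                            /* total: 2 colour changes, 9 rectangles */
}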
Understanding C, or for that matter, any low level programming language, gives you an opportunity to understand things like memory usage (i.e. why is it a bad thing to create several million heavy objects), how pointers/object references work, etc.
The problem is that as we've created ever increasing levels of abstraction, we find ourselves doing a lot of 'lego block' programming, without understanding how the legos actually function. And by having almost infinite resources, we start treating memory and resources like water, and tend to solve problems by throwing more iron at the situation.
While not limited to C, there's a tremendous benefit to working at a low level with much smaller, memory constrained systems like the Arduino or old-school 8-bit processors. It lets you experience close to the metal coding in a much more approachable package, and after spending time squeezing apps into 512K, you will find yourself applying these skills at a larger level within your day to day programming.
So the language itself is not important, but having a deeper appreciation for how all of the bits come together, and how to work effectively at a level closer to the hardware is a set of skills beneficial to any software developer.
For one, knowing C helps you understand how memory works in the OS and in other high-level languages. When your C# or Java program balloons in memory usage, understanding that references (which are basically just pointers) take memory too, and understanding how many of the data structures are implemented (which you get from making your own in C), helps you realise that your dictionary is reserving huge amounts of memory that aren't actually used.
For another, knowing C can help you understand how to make use of lower level operating system features. You don't need this often, but sometimes you may need memory mapped files, or to use marshalling in C#, and C will greatly help understand what you're doing when that happens.
I think C has also helped my understanding of network protocols, but I can't put my finger on specific examples. I was reading another SO question the other day where someone was complaining about how C's bit-fields are 'basically useless' and I was thinking how elegantly C bit fields represent low-level network protocols. High level languages dealing with structures of bits always end up a mess!
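For instance, a bit-field struct describing the fixed part of an IPv4 header shows how directly C can mirror a wire format. This is only a sketch: bit-field layout within a byte is implementation-defined, so real protocol code pins that down per compiler and byte order.

#include <stdint.h>

struct ipv4_header {
    uint8_t  ihl:4;          /* header length in 32-bit words */
    uint8_t  version:4;      /* always 4 for IPv4 */
    uint8_t  tos;            /* type of service */
    uint16_t total_length;
    uint16_t id;
    uint16_t flags_fragment; /* 3 flag bits + 13-bit fragment offset */
    uint8_t  ttl;
    uint8_t  protocol;       /* e.g. 6 = TCP, 17 = UDP */
    uint16_t checksum;
    uint32_t src_addr;
    uint32_t dst_addr;
};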
In general, the more you know, the better programmer you will be.
However, sometimes knowing another language, such as C, can make you do the wrong thing, because an assumption that holds there might not be true in a higher-level language (such as Python or PHP). For example, one might assume that finding the length of a list is O(N), where N is the length of the list. However, this is usually not the case in high-level languages. In Python, for most list-like things the cost is O(1).
Knowing more about the specifics of a language will help, but knowing more in general might lead one to make incorrect assumptions.
Just "knowing" C would not make you better.
But if you understand the whole thing (how native binaries work, how the CPU executes them, what the architecture's limitations are), you can write code which is easier on the CPU.
For example, how the L1/L2 caches affect your work, and how you should write your code to get more hits in them. When working with C/C++ and doing heavy optimizations, you will have to go down to that kind of thing.
It isn't so much knowing C as it is that C is closer to the bare metal than many other languages. You need to be more aware of how to allocate/deallocate memory because you have to do it yourself. Doing it yourself helps you understand the implications of many decisions that you make.
To me any language is acceptable as long as you understand how the compiler/interpreter (basically) maps your code onto the machine. It's a bit easier to do in a language that exposes this directly, but you should be able to, with a bit of reading, figure out how memory is allocated and organized, what sort of indexing patterns are more optimal than others, what constructs are more efficient for particular applications, etc.
More important, I think, is a good understanding of operating systems, memory architectures, and algorithms. If you understand how your algorithm works, why it would be better to choose one algorithm or data structure over another (e.g., HashSet vs. List), and how your code maps onto the machine, it shouldn't matter what language you are using.
This is my experience of how I learned and taught myself programming, specifically understanding C. This goes back to the early 1990s, so it may be a bit antique, but the passion and the drive are what matter:
Learn to understand the low level principles of the computer, such as EGA/VGA programming, here's a link to the Simtel archive on the C programmer's guide to the PC.
Understand how TSRs (terminate-and-stay-resident programs) work.
Download the whole archive of Bob Stout's snippets, which is a big collection of C code where each piece does one thing only. Study them and understand them; not only that, the collection of snippets strives to be portable.
Browse the International Obfuscated C Code Contest (IOCCC) online, and see how C code can be abused, to understand the intricacies of the language. The worst code abuse is the winner! Download the archives and study them.
I loved the infamous Ponzo's C Tutorial, which helped me immensely; unfortunately, the archive is very hard to find. If anyone knows where to obtain it, please leave a comment and I will amend this answer to include the link. There is another one that I can remember, Coronado's [Generic?] C Tutorial, but again, my memory on this one is hazy...
Look at Dr. Dobb's Journal and the C Users Journal here. I do not know if you can still get them in print, but they were classics; I can remember the feeling of holding a printed copy in my hand and tearing off home to type in the code to see what happened!
Grab an ancient copy of Turbo C v2, which I believe you can get from borland.com, and just play with 16-bit C programming to get a feel for it and mess with pointers... Sure, it is ancient and old, but playing with pointers on it is fine.
Understand and learn pointers, link here to the legacy Simtel.net; this is a crucial step toward achieving C guru-ship, for want of a better word. You will also find there a host of downloads pertaining to the C programming language; I remember actually ordering the Simtel CD archive and looking for the C stuff...
A couple of things that you have to deal directly with in C that other languages abstract away from you include explicit memory management (malloc) and dealing directly with pointers.
My girlfriend is one semester from graduating MIT (where they mainly use Java, Scheme, and Python) with a Computer Science degree, and she is currently working at a company whose codebase is in C++. For the first few days she had a difficult time understanding all the pointers/references/etc.
On the other hand, I found moving from C++ to Java very easy, because I was never confused about pass-references-by-value vs pass-by-reference.
Similarly, in C/C++ it is much more apparent that primitives are just the compiler treating the same sets of bits in different ways, as opposed to a language like Python or Ruby where everything is an object with its own distinct properties.
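A tiny sketch of that "same bits, different interpretation" point: the code below reinterprets a float's bit pattern as an integer (memcpy is used rather than a pointer cast to stay within the aliasing rules) and prints a char as the plain number it is.

#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    float f = 1.0f;
    uint32_t bits;
    memcpy(&bits, &f, sizeof bits);                        /* copy the raw bytes */
    printf("1.0f is stored as 0x%08X\n", (unsigned)bits);  /* 0x3F800000 on IEEE-754 systems */

    char c = 'A';
    printf("'A' is just the number %d\n", c);              /* 65 in ASCII */
    return 0;
}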
A simple (not entirely realistic) example to illustrate some of the advice above. Consider the seemingly harmless
while(true)
    for(Iterator iter = foo.iterator(); iter.hasNext();)
        bar.doSomething( iter.next() )
or the even higher level
while(true)
    for(Baz b: foo)
        bar.doSomething(b)
A possible problem here is that each time round the while loop a new object (the iterator) is created. If all you care about is programmer convenience, then the latter is definitely better. But if the loop has to be efficient or the machine is resource constrained then you are pretty much at the mercy of the designers of your high level language.
For example, a typical complaint for doing high-performance Java is having execution stop while garbage (such as all those allocated Iterator objects) is reclaimed. Not very good if your software is charged with tracking incoming missiles, auto-piloting a passenger jet, or just not leaving the user wondering why the GUI has stopped responding.
One possible solution (still in the higher-level language) would be to weaken the convenience of the iterator to something like
Iterator iter = new Iterator();
while(true)
    for(foo.initAlreadyAllocatedIterator(iter); iter.hasNext();)
        bar.doSomething(iter.next())
But this would only make sense if you had some idea about memory allocation...otherwise it just looks like a nasty API. Convenience always costs somewhere, and knowing lower-level stuff can help you identify and mitigate those costs.
I'm a web coder: I currently enjoy AS3 and deal with PHP. I own a Nintendo DS and want to give C a go.
From a higher level, what basic things/creature comforts are going to go missing?
I can't find [for... in] loops, so I assume they aren't there. It looks like I'm going to have to declare things religiously, and I assume I have no objects (which I dealt with in PHP a while ago).
Hash tables? Funny data types?
To sum it up, you'll basically get:
Typed variables
Functions
Pointers
Standard libraries
Then, you make the rest -- that may be a little too simplified, but that's a rough idea of what to face.
It can be daunting to begin with and there may be a learning curve to overcome. Here's a few speed bumps you may encounter:
String? What string?
One big thing to get used to would be strings. There is no such thing as a string type in C. A string is a "null-terminated character array" (sometimes called a C string), which basically means an array of type char with the final element being a \0 (char value 0).
In memory, a char array of length 4 containing Hi! would appear as:
char[0] == 'H'
char[1] == 'i'
char[2] == '!'
char[3] == '\0'
Also, strings don't know their own length (there are no "objects" that come for free in C), so the standard library call strlen is required, which is more or less a loop that walks through the string until it hits a \0 character. (This means it's an O(N) operation: the longer the string, the longer it takes to find its length, unlike the O(1) operation of most string implementations in modern languages.)
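A minimal sketch of what strlen conceptually does (real library versions are heavily optimised, but the O(N) scan is the same):

#include <stddef.h>

size_t my_strlen(const char *s)
{
    const char *p = s;
    while (*p != '\0')        /* walk forward until the terminator */
        p++;
    return (size_t)(p - s);
}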
Garbage collection?
There is no such thing as a garbage collector in C. In fact, you need to allocate and deallocate memory yourself:
/* Allocate enough memory for array of 10 int values. */
int* array_of_ints = malloc(sizeof(int) * 10);
/* Done with the array? Don't forget to free the memory! */
free(array_of_ints);
Failing to clean up after allocation of memory can lead to things called memory leaks which I'm sure you've heard of before.
Pointers!
And as always, when we talk about C, we can't forget about pointers. The whole concept of taking references to variables and dereferencing pointers can be a serious headache-inducing one, but once you get the hang of it, it's actually not too bad.
Except for the times when you expect it to work one way, but you find out that you didn't quite understand pointers well enough and it actually does something else -- as they say, been there, done that.
Oh, and pointers are probably going to be one of the first times you'll actually see a program crash bad enough that the operating system will yell at you. A segmentation fault is not something the computer likes a lot.
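A tiny example of the basics: taking an address, writing through a pointer, and (commented out) the classic NULL dereference that earns you a segmentation fault.

#include <stdio.h>

int main(void)
{
    int x = 5;
    int *p = &x;        /* p holds the address of x */
    *p = 7;             /* writing through the pointer changes x */
    printf("x = %d\n", x);

    int *bad = NULL;
    /* *bad = 1; */     /* uncommenting this dereferences NULL: segfault */
    (void)bad;
    return 0;
}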
Types
All variables in C will have types. C is a statically-typed language, meaning that variable types will be checked at compile time. This might take some getting used to at the beginning, but can also be seen as a good thing, as it can reduce runtime errors such as type errors where you try to assign a number to a string.
However, it is possible to perform typecasts, so you can cast an int type (an integer value) to a double type (a floating-point value). However, it is not possible to cast an int directly to a string like char*.
So, for example, in some languages the following is allowed:
// Example of a very weakly-typed pseudolanguage with implicit typecasts:
number n = 42
string s = "answer: "
string result = s + n // Result: "answer: 42"
In C, one would have to call an itoa function to get a char* representation of an int, then use strcat to concatenate two strings.
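As a sketch, the C equivalent of that one-liner might look like the following. Since itoa is not part of standard C, snprintf is used here to turn the number into text, and the buffer sizes are picked by hand for the example.

#include <stdio.h>
#include <string.h>

int main(void)
{
    char result[32] = "answer: ";          /* must be big enough for both parts */
    char num[12];
    snprintf(num, sizeof num, "%d", 42);   /* int -> text */
    strcat(result, num);                   /* concatenate onto the prefix */
    printf("%s\n", result);                /* prints: answer: 42 */
    return 0;
}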
Conclusion
Those things said, learning C coming from a higher-level language can be very eye-opening and probably challenging to begin with, but once you get the hang of it, it can be pretty fun to work with.
I'd recommend starting to experiment with a C compiler, and have a good book or reference.
I think many people will recommend the K&R book, which is indeed an excellent book.
At first, I didn't think recommending K&R as the first C book would be a good idea because it may be a little bit on the difficult side, but on second thought, I think it is a very comprehensive and well-written book that can be good for getting into C if you already have some programming experience.
Good luck!
Well ... You might be in for something of a culture shock. These are the 32 standard keywords in C, and that includes the basic types.
C's standard library is pretty functional (more so than people perhaps expect), but very very thin when compared to what higher-level languages give you. There is no hash table in sight, and you are correct to assume that C does not have syntactic or semantic support for objects.
It is possible to write pretty object-oriented code anyway, but you will have to jump through a few hoops, and do much more manually since the language won't help you. See for instance the GTK+ UI toolkit for an example of a well-designed object-oriented C library/API.
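A minimal sketch of the idea (nowhere near what GTK+/GObject actually does; the names are invented for illustration): a struct carries the data plus a function pointer that plays the role of a virtual method.

#include <stdio.h>

typedef struct Shape {
    double (*area)(const struct Shape *self);    /* the "virtual method" */
} Shape;

typedef struct {
    Shape  base;       /* "inheritance": the base struct comes first */
    double radius;
} Circle;

static double circle_area(const Shape *self)
{
    const Circle *c = (const Circle *)self;      /* downcast, GObject-style */
    return 3.14159265358979 * c->radius * c->radius;
}

int main(void)
{
    Circle c = { { circle_area }, 2.0 };
    Shape *s = &c.base;                          /* use it through the "base class" */
    printf("area = %f\n", s->area(s));
    return 0;
}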
I'm a web coder: I currently enjoy AS3 and deal with PHP. I own a Nintendo DS and want to give C a go.
Why do you want to do C programming?
What are your reasons, what do you hope to achieve?
Is it in order to write software for the Nintendo DS?
From a higher level, what basic things/creature comforts are going to go missing?
Given your background, I think you'll personally miss the lack of dynamic typing support; in other words, you will have to be very explicit in your C programs, and your data must be specified with proper types, so that the compiler knows what type of data you are working with. This also applies to any sort of memory management, i.e., basically anything once you start working with data structures that are not PODs (plain old data).
For example, where you would do something like this in php:
function multiply($x) {
    return ($x * $x);
}
You would have to do something like this in C:
int multiply(int x) {
    return (x * x);
}
While these may seem fairly similar, there are big differences, namely typing restrictions: the PHP version will also work with floating-point values, while in C you would have to explicitly provide versions for different types and ranges of values (C types are constrained to certain ranges).
I can't find [for... in] loops, so I assume they aren't there
In C, it looks more like the following:
int c;
for (c = 0; c <= 10; c++) {
    // loop body
}
it looks like I'm going to have to declare things religiously
Yes, very much so; much more so than you'll appreciate.
and I assume I have no objects (which I dealt with in PHP a while ago).
Correct, no objects, but OOP can still be emulated in other ways, such as passing a struct to a function: function(struct obj).
Depending on your goals and motivation, you may find C a pretty frustrating language to start serious programming with; you may want to look into some of the related alternatives, such as Java, instead.
Dynamic arrays and garbage collection: neither is built into C, so you'll need to roll your own or use a pre-existing solution.
The standard procedure is that you manage the memory yourself which might sound like something horrible but it really isn't. For example in AS3 and PHP you can create an array and forget it when you're done with it. In C you'll have to make sure to deallocate it yourself or memory will leak and bad stuff can/will happen.
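For example, a rolled-your-own growable array usually looks roughly like this sketch: track length and capacity, realloc when full, and free it yourself when done (error handling trimmed to the minimum).

#include <stdlib.h>

typedef struct {
    int    *data;
    size_t  len;
    size_t  cap;
} IntVec;

int intvec_push(IntVec *v, int value)
{
    if (v->len == v->cap) {                            /* out of room: grow */
        size_t new_cap = v->cap ? v->cap * 2 : 8;
        int *p = realloc(v->data, new_cap * sizeof *p);
        if (!p)
            return -1;                                 /* caller must handle failure */
        v->data = p;
        v->cap  = new_cap;
    }
    v->data[v->len++] = value;
    return 0;
}

void intvec_free(IntVec *v)
{
    free(v->data);                                     /* no GC: you free it yourself */
    v->data = NULL;
    v->len = v->cap = 0;
}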
You'll particularly miss automatic memory management, and semantically meaningful datatypes such as strings, tables &c. However, learning C well is quite instructive, even though you probably don't want to use it for application-level programming, so I suggest you grab a "K&R" (Kernighan and Ritchie's seminal book) and give it a go -- you'll find plenty of free libraries on the web to use and study as you proceed beyond that, though you'll have to discipline yourself to use proper memory management heuristics... happy learning!
I was just doing some research online, and it seems there is a viable possibility of using Lua for developing on the Nintendo DS. This may in fact be the easiest way for someone familiar with high-level languages to get started with embedded development, without sacrificing too much HLL power and without experiencing the inevitable culture shock of migrating from an HLL to C: microlua; here are the API docs.
So you might want to give it a go, possibly using an emulator for starters.
Keep us posted!
I'm pretty sure you want to be looking at C++, not C. C++ is basically object oriented C.
What you'll REALLY miss is the ability to rapidly prototype and test changes. You can't just change a line of code and run. Even using build tools like "make" a recompile can often take several minutes. This is even worse when you consider that it's really easy to make mistakes in C/C++. On large projects I reckon I spend more time compiling than actually coding. As a long-term user of script languages this is my biggest issue with using C.
Moving directly from a higher-level language running on a machine with effectively infinite resources to a DS is going to be a challenge, and not just because of the language.
The Nintendo DS has only 4MB of RAM, a 66MHz ARM-7, no operating system, and the development libraries available (such as libnds) provide only a thin abstraction over the hardware itself.
So, in addition to having to deal with manual memory management, a simpler language with fewer creature comforts, static typing, lack of objects, and the need to run a compile step before you can see any changes, you also have to deal with memory fragmentation, a very slow CPU by modern standards, and needing to interact with the hardware directly in order to do anything useful.
When writing code for the DS, the only other option is C++. You can't use a lot of the advanced features that make C++ worthwhile on such a limited system, so you'd effectively be writing C code using a C++ compiler.
That said, it's a lot of fun. You can screw around with the hardware all you like, and there's no need to interface with the operating system, because there isn't one.
C is the next level above straight assembler and allows you to operate close to the metal. This gives power to do amazing stuff but also to easily shoot yourself in the foot!
One such example is direct memory access and the perils and wonder of pointer arithmetic. Pointers are very powerful, fast, and handy; however, they require careful management. See this SO question for an example.
Also as mentioned by the other answerers you will have to do your own memory management. Again powerful and painful.
I would recommend studying a good textbook and finding some quality example code. The key thing is to learn the patterns that make all this stuff hang together correctly and elegantly (well, as much as possible). A good debugger will also really help; get familiar with the standard C library too.
You may notice your applications crashing at the drop of a hat initially, but persevere, as C is definitely worth at least dabbling in. You will understand some of the amazing abstractions higher-level languages provide and what is really going on under the hood.
We need more homebrew developers. I am a GBA/NDS (and many other embedded platforms) developer and hope to see that you continue with this. I would say skip down to ARM assembler and then back up to C or any other language you like; once you know how the processor works, languages are just syntax.
I assume your prior experience covers the programming mindset, breaking things down into bite sized chunks and then writing code to perform those chunks. Then another module that links those together and so on. Then C is just another language, a very very simple language, no need to dive into the corners of it, drive down the middle. It is a good habit to declare variables, etc, and here you will have to. The compilers will tell you when you have forgotten something. You are not going to need big concepts, big structures, language magic, this is embedded, you are resource limited, write some bytes here, read a register there, extract a bit from the data to see if a button has been pressed, write a register in response to move a sprite, etc.
I think you will find the NDS much harder than C at first; there are two processors and some infrastructure needed to get the simplest of working binaries. Granted, there are many, many examples out there as well. I generally recommend (and still do) starting with the GBA and then graduating to the NDS. Bite-sized chunks.
A lot of things from OOP are the same or almost the same in PHP and C#.
You don't play with pointers in C# (compared to C++) so I would definitely recommend going with C# if you want to play with C.
What C are you talking about?
C#
foreach(string item in itemsCollection)
{
...
}
PHP
foreach($itemsCollection as $key=>$value)
{
...
}
etc.
I like C# because it is strongly typed and your types are automatically checked while you write code... The possibility of accidentally saving an integer into a string, or vice versa, is zero, compared to PHP where you can save anything into anything...
In the Stack Overflow podcasts, Joel Spolsky constantly harps on Jeff Atwood about Jeff not knowing how to write code in C. His statement is that "knowing C helps you write better code." He also always uses some sort of story involving string manipulation and how knowing C would allow you to write more efficient string routines in a different language.
As someone who knows a little C, but loves to write code in perl and other high-level languages, I have never once come across a problem that I was able to solve by writing C.
I am looking for examples of real-world situations where knowing C would be useful while writing a project in a high-level/dynamic language like perl or python.
Edit: Reading some of the answers you guys have submitted has been great, but it still doesn't make sense to me in this regard:
Take the strcat example. There's a right way and a wrong way to combine strings in C. But why should I (as a high-level developer) think that I am smarter than Larry Wall? Why wouldn't the language designers write the string manipulation code the right way?
The classic example that Joel Spolsky uses is on misuse of strcat and strlen, and spotting "Shlemiel the painter" algorithms in general.
It's not that you need C to solve problems that higher-level languages can't solve, it's that knowing C well gives you a perspective on what's going on underneath all those levels of languages that allows you to write better software. Because just such a perspective helps you avoid writing code which is, unknown to you, actually O(n^2), for example.
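For reference, a hedged sketch of the Shlemiel pattern and its fix in C (it assumes dest is large enough and parts is NULL-terminated): the first version rescans the growing string on every append, the second simply remembers where the end is.

#include <string.h>

/* O(n^2): every strcat rescans dest from the start to find its end. */
void join_shlemiel(char *dest, const char *parts[])
{
    dest[0] = '\0';
    for (size_t i = 0; parts[i] != NULL; i++)
        strcat(dest, parts[i]);
}

/* O(n): keep a pointer to the end instead of re-finding it each time. */
void join_linear(char *dest, const char *parts[])
{
    char *end = dest;
    for (size_t i = 0; parts[i] != NULL; i++) {
        size_t len = strlen(parts[i]);
        memcpy(end, parts[i], len);
        end += len;
    }
    *end = '\0';
}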
Edit: Some clarification based on comments.
Knowing C is not a prerequisite for such knowledge, there are many ways to acquire the same knowledge.
Knowing C is also not a guarantee of these skills. You may be proficient in C and yet still write horrible, grotty, kludgy code in every other language you touch.
C is a low-level language, yet it still has modern control structures and functions so you aren't always getting caught up in the fiddly details. It's very difficult to become proficient at C without gaining a mastery of certain fundamentals (such as the details of memory management and pointers), mastery of which often pays rich dividends when working in any language.
It's always about the fundamentals.
This is true in many pursuits as well as software engineering. It is not secret incantations that make the best programmers the best; rather, it is a greater mastery of the fundamentals. Experience has shown that knowledge of C tends to have a higher correlation with mastery of certain of those fundamentals, and that learning C tends to be one of the easier and more common routes to acquiring such knowledge.
It's a mistake to assume that learning C will somehow automatically give you a better understanding of low-level programming concerns. In a lot of cases even C is too high level to give you a good understanding of efficiency concerns.
A classic is i++ versus ++i. It's over-cited, so perhaps most people know the implications about performance between these two operations. But learning C wouldn't magically teach you this by itself.
I guess I understand arguments about strings. When string operations are made deceptively simple, people often use them in inefficient ways. But again, knowing that strncat exists doesn't give you a full appreciation for the efficiency concerns. A lot of C programmers probably haven't even thought about the fact that strncat has to do a strlen operation internally.
Even using C, it's important to understand what's going on behind the scenes if efficiency is a concern. People who know C tend to view things in a progression. Assembly and machine code are the building blocks of C, while C is a building block of higher level languages.
This isn't strictly true, but it's obvious that C is "closer to the metal" than many higher-level languages. This has at least two effects: efficiency concerns aren't as hidden behind implicit behavior, and it's easier to screw up.
So you want a specific example of how knowing C gives you an advantage. I don't think there is one. I think what people mean when they say this is that knowing what's going on behind the scenes in whatever language you're happening to write for helps you make more intelligent decisions about how to write code. However, it's a mistake to assume that C is "what's going on behind the scenes" in Java, for instance.
It's hard to quantify exactly, but having an understanding of C will give you more insight into how higher-level language constructs are implemented, and as a consequence you'll be better able to use those constructs in an intelligent manner.
To give you a specific example: having to write my own garbage collection routines has helped me write better code.
I don't think I have ever found a problem that I haven't been able to solve with a higher-level language; but having started by learning C, it instilled in me quite a number of excellent development practices. Knowing how the rudimentary parts of an application's flow work enables you to look at your own code and get a good picture of how the data flows and where it is stored. This then leads to a better understanding of how to track down leaking memory, slow disk reads, poorly constructed caches, etc.
Keeping track of Pointers... that's another one that comes to mind.
Classic examples are things involving lower level memory management, such as the implementation of a linked list class:
struct Node
{
    Data *data;
    struct Node *next;
};
Understanding how the pointers are used to iterate the list, and what they signify in terms of the machine architecture will allow you to better understand your high level code.
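For example, walking the list above is just a matter of following next until it is NULL (assuming head points at the first node, or is NULL for an empty list):

#include <stddef.h>

size_t list_length(const struct Node *head)
{
    size_t count = 0;
    for (const struct Node *p = head; p != NULL; p = p->next)
        count++;                 /* one hop per node, O(N) overall */
    return count;
}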
Another example which Joel was referring to was the implementation of string concatenation, and the right way to create a string from a set of data.
// looks efficient, but each strcat rescans str from the start to find its end,
// so the loop as a whole is O(n^2) -- the classic "Shlemiel the painter" pattern
for (int i = 0; i < n; i++)
{
    strcat(str, data(i));
}
// this is usually fine: std::string remembers its own length, so appending is cheap,
// but you'd need to look at the implementation to be sure
std::string str;
for (int i = 0; i < n; i++)
{
    str += data(i);
}
Knowing C helps you write better code in C. I guess that the example from Joel Spolsky is of little use in C++ or Objective-C, where specific classes for manipulating strings exist and have been crafted with performance in mind. Moreover, using C tricks in other languages may be counterproductive.
Nevertheless, C knowledge is very helpful for understanding general concepts in other languages and what is under the hood in many situations.
As someone who knows a little C, but loves to write code in perl and other high-level languages, I have never once come across a problem that I was able to solve by writing C.
I am looking for examples of real-world situations where knowing C would be useful while writing a project in a high-level/dynamic language like perl or python.
It's easy to start writing high-level code and then wonder why it's running slowly. The truth is there are many ways to write perl or python code, and some are better (as in more efficient) than others. If you know the low-level details of how your code is executed in perl or python (both of which are written in C), you can code around several inefficiencies, like knowing which looping construct is faster, how memory is retained and released, etc.
Also, when writing a project in perl or python you sometimes hit a performance wall. The creators of the language (Guido, at least) advocate that you implement that part in C, as a language extension. To do that, well, you'll have to know C.
So, there.
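For what that looks like in practice, here is a minimal sketch of a CPython extension module using the Python 3 C API. The module and function names are invented for the example, and a real extension also needs a build step (e.g. a setup script) on top of this.

#include <Python.h>

/* A C function exposed to Python: squares an integer. */
static PyObject *fast_square(PyObject *self, PyObject *args)
{
    long n;
    if (!PyArg_ParseTuple(args, "l", &n))   /* parse one Python int argument */
        return NULL;
    return PyLong_FromLong(n * n);          /* build and return a Python int */
}

static PyMethodDef fastmath_methods[] = {
    {"square", fast_square, METH_VARARGS, "Square an integer in C."},
    {NULL, NULL, 0, NULL}
};

static struct PyModuleDef fastmath_module = {
    PyModuleDef_HEAD_INIT, "fastmath", NULL, -1, fastmath_methods
};

PyMODINIT_FUNC PyInit_fastmath(void)
{
    return PyModule_Create(&fastmath_module);
}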
For the purposes of argument, suppose you wanted to concatenate the string representations of all the integers from 1 to n (e.g. n = 5 would produce the string "12345"). Here's how one might do that naïvely in, say, Java.
String result = "";
for (int i = 1; i <= n; i++) {
result = result + Integer.toString(i);
}
If you were to rewrite that code segment (which is quite good-looking in Java) in C as literally as possible, you would get something to make most C programmers cringe in fear:
char *result = malloc(1);
*result = '\0';
for (int i = 1; i <= n; i++) {
    char *intStr = malloc(11);   /* enough for a 32-bit int and the '\0' */
    itoa(i, intStr, 10);         /* note: itoa is common but not standard C */
    char *tempStr = malloc(strlen(result) + strlen(intStr) + 1);
    strcpy(tempStr, result);
    strcat(tempStr, intStr);
    free(result);
    free(intStr);
    result = tempStr;
}
Because strings in Java are immutable, Integer.toString creates a dummy string and string concatenation creates a new string instance instead of altering the old one. That's not easy to see from just looking at the Java code. Knowing how said code translates into C is one way of learning exactly how inefficient said code is.
Do you use arrays much? And do you come across situations where you need items to be stored in memory without knowing in advance how many of them there will be (e.g., based on a query from the database)? Then I suppose C would teach you great things like stacks, structs, and linked lists, which might help you. Regards, Andy
Knowing C is really not worth much. Many of us who know C deeply like to think that all that deep insight is valuable and important.
Some of us who know C can't think of a single specific feature of C that's helpful to know about.
Knowing how pointers work in C (especially with C's syntax) isn't all that helpful. In a high-level language your statements create objects and manage their interaction. Pointers and references are -- perhaps -- interesting from a hypothetical point of view. But the knowledge has no practical impact on how you use Java or Python.
The higher-level languages are the way they are. Knowing how doesn't change those languages; it doesn't change how you use them, debug or test them.
Knowing how to create or manipulate a linked list has no earthly impact on Python list class definition. None.
Knowing the difference between Linked List and Array List might help you write a Java program. But the C implementation doesn't help you choose between Linked List and Array List. The decision is independent of knowing C.
A bad algorithm is bad in every language. Knowing inner mysteries of C doesn't make a bad algorithm any less bad. Knowing C doesn't help you know the Java collections or the Python built-in types.
I can't see any value in learning C. Learning Fortran is just as valuable.
Technically, all of the deficiencies of C force you to code around them, making you write more code and thereby making you more experienced in general. Lacking any portable integer type bigger than 32 bits, for example, C has in the past made me write my own bignum library.
The lack of implicit memory, resource and error management (garbage collection, RAII, automatically-called constructors/destructors, maybe exceptions) force C users to write a lot of initialization, error-handling and cleanup code. It may just be me, but I'm never tired of writing such code. I go and read the documentation of every external function I call, return to my code and check for every return value and other failure-indicative stuff. It even makes me feel safe!
This last point is probably the biggest one to be made in favor of the argument. You can only write so many malloc()/free() pairs before you start to analyze the lifetime of every single variable you come across in every single language! C++'s automatic-storage objects don't help this disorder, either.
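A sketch of the style this pushes you into: check every failure, release what you acquired in reverse order, and keep a single exit path (the common "goto cleanup" idiom). The file name and buffer size here are arbitrary.

#include <stdio.h>
#include <stdlib.h>

int process_file(const char *path)
{
    int   rc  = -1;
    char *buf = NULL;
    FILE *fp  = fopen(path, "rb");
    if (!fp)
        goto cleanup;              /* nothing else acquired yet */

    buf = malloc(4096);
    if (!buf)
        goto cleanup;

    if (fread(buf, 1, 4096, fp) == 0)
        goto cleanup;              /* treat an empty read as failure here */

    rc = 0;                        /* success */

cleanup:
    free(buf);                     /* free(NULL) is a no-op, so this is safe */
    if (fp)
        fclose(fp);
    return rc;
}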
Writing truly portable C code often requires the programmer to shed a lot of assumptions about the host system: think sizeof(), CHAR_BIT, unsigned long, UINT_MAX. While this hasn't helped me write better code in other languages, it has helped me think about possible alternative implementations: how a tiny microprocessor could still run my C code, generating a gazillion RISC instructions for my simple one-line statement. (That is another thing; not many other languages map to and from a given assembly language so easily in my head. Then again, that may just be me.)
Of course, none of these arguments apply only to C. @S.Lott has a valid point: Fortran might be an equally good alternative. But there is so much C code around! A whole personal computer system from top to bottom (applications to libraries to drivers to kernel) is available in source code in C. It would be such a waste if you could not read it.
I think it is worth knowing some low-level language, and there are pragmatic reasons to choose C:
It's low-level, close to assembler
It's widespread
Understanding the whole stack is valuable. Sometimes you need to debug something's guts. Sometimes you cannot fix a performance problem without low-level knowledge (this is often not the case, e.g., when the performance problem is purely algorithmic, but sometimes it is).
Why is C widely considered the quintessential "bottom of the stack", and not some other language(s)? I think this is because C is a low-level programming language, and C won. It has been a while now, but C was not always as dominant. To take just one famous example, the proponents of Common Lisp (which had its own ways of writing low-level code) were hoping their language would be popular too, and eventually lost.
The following are usually implemented in C:
operating systems (Unix variants, Windows, many embedded operating systems)
higher-level programming languages (many popular implementations of Java, Python, etc)
(obviously) reams of popular open source projects
I'm not a hardware person, but I gather that C has influenced CPU design heavily, too.
So if you believe in understanding the whole stack, learning C is, from a pragmatic perspective, the best choice.
As a caveat, I think it's worth learning assembler as well. Although C is close to the metal, I didn't fully understand C until I had to do some assembler. It is occasionally helpful to understand how function calls are actually performed, how for loops are implemented, etc. Less important, but also useful, is having to (at least once) deal with a system without virtual memory. When using C on Windows, Unix, and certain other operating systems, even humble malloc does a lot of work under the covers that is easier to appreciate, debug and/or tune if you've ever had to deal with manually locking and unlocking memory regions (not that I would recommend doing so on a regular basis!)
I see it like this: everything boils down to C at a cross-platform level, and to assembly in a platform-specific way. So it's like being a cross-country rally racer, and C is basic automotive mechanics: you can be a great driver, but when you get into trouble, knowing C means you can probably get yourself back in the race; if not, you're stuck calling the mechanics. Assembly is what the mechanics and manufacturers know; it's a worthy investment if that's what you want to do, otherwise you can just trust the mechanics.
For specifics, think about memory management, hardware drivers, physics engines, high-performance 3D graphics, TCP stacks, binary protocols, embedded software, and creating high-level languages like Perl.
You cannot write an OS kernel in Perl; C would be a much better choice for that, because it is low-level enough to express everything the kernel should do, and portable enough to let you port your kernel to different architectures.
Knowing C is not a requirement for being able to effectively use higher-level languages, but it certainly can help one's general understanding of how computers and software work. I think it's similar to the assertion that knowing some assembly language or computer architecture/hardware logic (and/or/nand gates, etc.) can help a C programmer be a better programmer.
Sometimes in order to solve a problem it helps to know how things are working 'underneath' what you're doing.
I don't think this means a programmer must know C in order to be a good programmer, but I think that knowing C can be helpful to almost any programmer.
Not knowing Perl well, I am wondering whether it is now possible to distribute processor load to more than one physical core with several threads created in a single Perl program, without spawning additional processes.
I don't think there can be any specific example.
What learning C does for you is give you an insight, a broadening of the mind, into how computers (and software) work. It's a very abstract thing.
It doesn't make you write better code in python, it just makes you more of a computer scientist.
The reference that Wedge made to Joel's article mentioning Shlemiel the painter is an interesting one but has no relevance here. That algorithm is not tied to C in any particular way (although it manifests itself in null-terminated strings).
Python's strings are immutable anyway, and completely different from C's model of strings, so I don't quite see the relationship.
I suppose one concrete example is optimizing a parser or a lexer or a program that keeps writing to a string buffer all the time. If you use normal strings instead of a string buffer, you'll run across a problem when you build very large strings.
Consider that:
a = a + b
creates a new string from copies of a and b. It doesn't change the string that was referenced by a; it creates a new one, allocating more memory, etc.
If a becomes considerably large, and you keep adding small things to it, then Shlemiel the painter will manifest himself.
But then again, knowing this has nothing to do with knowing C, just with knowing how your language implements things at the low level. (This is where having experience with C will help you.)
In Python, say you have a function
def foo(l=[]):
    l.append("bar")
    return l
On some version of Python, available about a year ago, running foo() four times, you'd get a really interesting result (i.e., ["bar", "bar", "bar", "bar"]).
It seems that someone implemented the default parameters as a static variable (and without resetting it), so unexpected results happen.
Perhaps my example was contrived - a friend of mine who actually likes Python found this peculiar bug, but the fact of the matter is all of these languages are implemented in C or C++. Not knowing and not understanding concepts that are fundamental to the base language means that you won't have an in-depth understanding of languages that are built on top of that.
I find all the "why bother with C/C++/ASM" questions silly. If you're inclined enough to learn a language, that means that you're curious enough to get into it in the first place. Why stop just before C?
Knowing C is great because it does nothing behind your back (GC, bounds checking, etc.). It only does exactly what you tell it to. Nothing is implied. Even C++ does things you don't tell it to, with RAII (of course, it is implied that the object is destructed when it goes out of scope, but you don't actually write that). C is a great way to learn what goes on 'under the hood' of the computer, without having to write assembly.
Inefficient code (e.g., loops of string +=) is typically inefficient in any language. What difference does it make if someone explains why it is inefficient in one language or the other? Knowing C but not realizing that a method is inefficient is no different from knowing Python and not realizing the same.