I'm a C# programmer and I'm sold on the benefits of learning C. I want to deepen my knowledge of the underlying OS and CPU, understand the pain of memory management that garbage collection encapsulates away and generally improve my high-level programs thanks to an appreciation of the low-level issues that the compiler is dealing with on my behalf.
My question is how long can I expect to spend learning the C language in order to gain these benefits?
Is a couple of weekends spent reading the K&R book from cover to cover sufficient, or do I need to schedule time to cut some code? Do I need to spend time delving into any libraries, or is an understanding of the first-order concepts in the language enough to improve my C# code?
To be clear, I don't intend to write any significant programs in C. My goal is more to learn from the language than to become an expert in the language.
C will take a week to learn, and a lifetime to master.
Reading the K&R book and not writing code is like reading a book on weapons and never actually shooting. Yes, you've read in a book that it works this way, but you have never encountered the typical problems that arise while doing it. Without practice, such "knowledge" is worth very little.
Plan to spend 2-3 years slowly writing small programs to solve different tasks in C. That will count as real experience. C provides delayed gratification for your effort.
I'm not sure how long it takes to learn a language - it probably comes down to the individual. But I'm pretty confident you can't learn one without writing and debugging code in it.
Ten Years
If you can read K&R and understand it all, that's pretty good, as K&R covers pretty much all of the language.
However, reading it and understanding it all are very different. You should probably take a few passes through K&R and do all the associated exercises to ensure you really know it.
Even after reading through all of that, you will spend more months learning pointers the hard way. Expect lots of seg faults. On the plus side though, you'll get really good at reading hex!
There are a few caveats in the language that you'll discover as well. One that used to give me trouble is that on typical platforms all object pointers are the same size (4 bytes on 32-bit x86, 8 bytes on x86-64), regardless of what they point at: a char* is the same size as a void* and an int*, even though the standard doesn't strictly guarantee it.
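A quick way to see this on your own machine is just to print the sizes; the exact numbers depend on the platform and ABI, so treat the output of this small sketch as illustrative:

#include <stdio.h>

int main(void)
{
    /* On most common ABIs all object pointers have the same size,
       though the C standard does not strictly guarantee it. */
    printf("char*  : %zu bytes\n", sizeof(char *));
    printf("int*   : %zu bytes\n", sizeof(int *));
    printf("void*  : %zu bytes\n", sizeof(void *));
    printf("double*: %zu bytes\n", sizeof(double *));
    return 0;
}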
It will take a lot lot longer if you just sit around asking abstract questions and not actually diving in and doing it. Do you have a deadline or something? How long will it take me to learn the piano? Who cares, I just wanna make some noise. That's how kids learn so fast. They don't care about becoming an expert, or even good. They just like to play.
In any case, if you want to learn some interesting things, try some assembler as well. A lot of people really hate it, but that's just because they don't like spending countless hours not accomplishing much. I like it just fine.
You definitely need to write some code - I don't believe you can learn any language without doing that. K&R has lots of exercises you can practice on. It's difficult to know how long in terms of elapsed time it will take to get a good working knowledge - I used to teach pretty much the whole language in 4.5 days, but that is quite intensive. I'd suggest about a month, if you are doing an hour or so a day.
Edit: I must admit, I find it a bit depressing that so many people think C is so difficult. K&R is 272 pages long, in my copy, and covers basically everything you need to know, including the standard library. Is there a book in ANY other programming language that covers the whole shebang so concisely? I don't think so, and the reason is not that K&R is compressed in some way (Brian Kernighan is THE greatest technical writer, IMHO) but that the language is simple and easy to describe.
I read the K&R book cover to cover and would not say I have any great understanding of C. Some time doing the exercises in K&R would be hugely beneficial.
I'm sure C libraries would make you more productive writing programs, but if it is simply learning C you are interested in, then you can implement anything you need yourself. www.projecteuler.net is a good source of problems (although slightly mathematical in general) for you to get started on, if you fancy trying some coding outside of the K&R exercises.
In a couple of weekends, you will obtain mainly two results:
hello world
a lot of segmentation faults
C is not easy, in particular if you are not used to its hardcore concepts. You will have to invest weeks, even months, tinkering with it to grasp its more obscure (but still not too obscure) essence.
40 days and 40 nights.
If you can't do the days and nights sequentially, then it will be 42 weekends.
But seriously, without putting any context on how fast you learn other topics, nobody can give you a real answer that is relevant to you. We can say how long it took us to learn it to a satisfying level, but that has zero correlation to how long it should take you to learn it.
If you said it took you 6 months to be good at C#, then maybe we can say it should take you 6 months * X (where X is still a guess, but a better guess than now).
We can all agree, however, that just reading the book is not enough. Of course you will have to write code. That is how we best learn anything - read it, write it, teach it. If you really want to learn something, teach it.
To understand the pain of memory management, just begin writing sample programs with stacks, linked lists, binary trees, etc. You'll see what you're getting into.
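For instance, even a tiny linked-list stack forces you to pair every malloc with a matching free yourself; here is a minimal sketch (the names and values are just for illustration):

#include <stdio.h>
#include <stdlib.h>

struct node {
    int value;
    struct node *next;
};

/* Push a value: the caller now owns this allocation. */
static struct node *push(struct node *top, int value)
{
    struct node *n = malloc(sizeof *n);
    if (n == NULL) {
        perror("malloc");
        exit(EXIT_FAILURE);
    }
    n->value = value;
    n->next = top;
    return n;
}

/* Pop a value: forget the free() and you leak; free twice and you crash. */
static struct node *pop(struct node *top, int *out)
{
    struct node *next = top->next;
    *out = top->value;
    free(top);
    return next;
}

int main(void)
{
    struct node *stack = NULL;

    for (int i = 0; i < 5; i++)
        stack = push(stack, i);

    while (stack != NULL) {
        int v;
        stack = pop(stack, &v);
        printf("%d\n", v);
    }
    return 0;
}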
In school I was taught C as the introductory language, and as pointers got introduced a whole slew of people dropped the class because, frankly, it's a hard concept to grasp.
As many of the other answers have stated... plan to not only read but practice. There's no doubt you've learned a lot of C# just by making mistakes while coding and having 'aha!' moments.
IMO: 3 to 4 years to really understand the majority of concepts. A book will help you realize what the capabilities of the language are.
What is the best place or a link to learn algorithms in C? How do you know when and where to use the implementation of algorithms by just looking into the problems?
Algorithms aren't necessarily tied to a specific language, just to clarify, so any algorithms book will work great as long as you can understand the concept behind the data structure/algorithm.
That said, this seems like a good choice: Algorithms in C. I have the C++ equivalent on my shelf.
There is also a book that seems language-agnostic (correct me if I'm wrong) called Data Structures & Algorithms, though I hear it's a bit dated, so you'll miss out on more recent structures.
Don't forget the internet has a plethora of information available to you. However, books are usually better for these sorts of things. This is because internet resources tend to focus on one thing at a time. For example, you need to understand what Big-O notation is before you can understand what it means when we say a List has O(1) [constant time] removal.
A book will cover these things in the correct order, but an internet resource will focus on either Big-O notation or data structures, but often won't easily connect the two.
When it comes to using it, you'll mostly make the connection when it comes to what you'll be doing with the data.
For example, you might want a vector (array) if you just need ordered elements, but if you need ordered elements and removal from any place (and can sacrifice random access), then a list would be more appropriate, due to its constant-time removal.
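As a rough illustration of that trade-off, removing a node from a singly linked list is O(1) once you hold a pointer to its predecessor, while removing from the middle of an array means shifting everything after it; a sketch with hypothetical types:

#include <stdlib.h>

struct list_node {
    int value;
    struct list_node *next;
};

/* O(1): unlink the node that follows 'prev'; nothing is shifted. */
void remove_after(struct list_node *prev)
{
    struct list_node *victim = prev->next;
    if (victim != NULL) {
        prev->next = victim->next;
        free(victim);
    }
}

/* O(n): removing index i from an array means moving n - i - 1 elements. */
void remove_at(int *array, size_t *n, size_t i)
{
    for (size_t j = i + 1; j < *n; j++)
        array[j - 1] = array[j];
    (*n)--;
}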
For a reasonable (though far from perfect) book on implementing commonly used algorithms in C, try Sedgewick's Algorithms in C. Note that, as for any technical subject, a paper book is likely to be far superior to any Web resources.
As to how to know when to use a specific algorithm, I'm afraid that is down to experience.
For an algorithms text, Cormen, Leiserson and Rivest's 'Introduction to Algorithms' is a good start. The pseudocode implementations are easy to translate to C (see the small sketch after the links below). Two web resources with many links to documentation about algorithms and sample implementations are:
Stony Brook Algorithm Repository
NIST Directory of Data Structures and Algorithms
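As an example of how directly such pseudocode translates, here is insertion sort written out in plain C; the function name and the test values are only illustrative:

#include <stdio.h>

/* A straightforward translation of the classic insertion-sort pseudocode. */
void insertion_sort(int a[], int n)
{
    for (int j = 1; j < n; j++) {
        int key = a[j];
        int i = j - 1;

        while (i >= 0 && a[i] > key) {
            a[i + 1] = a[i];
            i--;
        }
        a[i + 1] = key;
    }
}

int main(void)
{
    int a[] = {5, 2, 4, 6, 1, 3};
    int n = (int)(sizeof a / sizeof a[0]);

    insertion_sort(a, n);
    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);
    printf("\n");
    return 0;
}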
Algorithms in C by Sedgewick is a great place to start the investigation. Once you are familiar with what algorithms are available and what the performance characteristics of each are, you'll be able to see where to use each of them.
This is my collection of mostly math-related algorithms:
List of algorithms
FXT (math related)
Numerical Methods
Numerical Recipes in C
"How do you know when and where to use the implementation of algorithms by just looking into the problems?"
It's called "pattern matching", once you've seen and solved lots of problems you start to recognize common things and you can reuse your previous knowledge.
By the way, I would recommend a good book on algorithms in general before starting with algorithms in C, which are more difficult to implement and more error-prone than in a higher-level language; once you are very confident with the general procedures, you can start to tweak and optimize them in C.
Many good resources have already been named, so I won't repeat them here.
As for how do you know what algorithm to use when?
You need to have a big enough toolbox, which you will obtain by sitting down and slogging through a long list of basic (and then more esoteric) data structures and algorithms. You should try to get all the basics, but really only need a sample from the more specialized ones.
You need to understand what trade-offs are available to you (time, code complexity, memory, single versus multiple passes, in-place versus copy, stable versus unstable sorts, etc., ad nauseam), and how the algorithms you study do on each of these. Again, this is just a case of much studying. Big-O is a place to start, but is not the be-all and end-all of this.
You need to get a feel for understanding what are the real limits you face when presented with a problem, and how to express these in terms of the algorithm trade offs mentioned above. This requires a degree of intuition, and is generally learned by practice over time.
It is worth implementing some things more than one way as you go along, to learn in your gut what works and what doesn't.
It is worth reading code written by folks more experienced than yourself, to see how they think.
Good luck.
The Wikipedia List of Algorithms is also very handy reference.
And, if you want to get deeper -- The Art of Computer Programming (wikipedia ref).
Preferably after the Robert Sedgewick book already referred in multiple answers.
I read Pointers on C by Kenneth Reek recently. I thought I was pretty well versed in C, but this book gave me a few epiphanies, despite being aimed at beginners. The code examples are things of beauty (though not the fastest code on an x86-like CPU). It provides good implementations of many of the most common algorithms and data structures in use, with excellent explanations of why they are implemented as they are (and sometimes code or suggestions for alternative implementations).
On the same page as your question: patterns for creating reusable code in C (that is what we all want, isn't it?): C Interfaces and Implementations: Techniques for Creating Reusable Software, by David R. Hanson. It has been a few years since I read it, and I don't have a copy to verify what I recall is correct, but if I remember correctly it deals with how to create good C APIs for data structures and algorithms, as well as giving example implementations of some of the most common algorithms.
Off topic: as I have mostly written throw-away programs in C for private use, this one helped me get rid of some bad coding habits as well as being an excellent C reference: C: A Reference Manual. Reminds me that I ought to buy that one.
One needs experience to know which set of algorithms to use for a particular problem. Defining a goal will help. Speed, memory, robustness, solution quality ... are all factors in determining which algorithms to use. We could devise different solutions to the same problem given different set of factors and scenarios.
The Algorithm Design Manual is worth a look.
An easy way to learn algorithms is to use the Wikipedia pages dedicated to the "classical" algorithms, such as searching and sorting. Building algorithms rests on the ability to use different data structures, like linked lists, in C. So first try to implement different data structures, such as a simple linked list or a binary tree, and then try to use them in algorithms related to real-life problems.
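For example, even a minimal binary search tree already exercises structs, pointers, recursion, and manual allocation; a sketch with hypothetical names (freeing the tree is left out for brevity):

#include <stdio.h>
#include <stdlib.h>

struct tree_node {
    int key;
    struct tree_node *left;
    struct tree_node *right;
};

/* Insert a key into a binary search tree, allocating nodes as needed. */
struct tree_node *insert(struct tree_node *root, int key)
{
    if (root == NULL) {
        struct tree_node *n = malloc(sizeof *n);
        if (n == NULL) {
            perror("malloc");
            exit(EXIT_FAILURE);
        }
        n->key = key;
        n->left = n->right = NULL;
        return n;
    }
    if (key < root->key)
        root->left = insert(root->left, key);
    else
        root->right = insert(root->right, key);
    return root;
}

/* In-order traversal prints the keys in sorted order. */
void print_inorder(const struct tree_node *root)
{
    if (root == NULL)
        return;
    print_inorder(root->left);
    printf("%d ", root->key);
    print_inorder(root->right);
}

int main(void)
{
    struct tree_node *root = NULL;
    int keys[] = {5, 2, 8, 1, 3};

    for (int i = 0; i < 5; i++)
        root = insert(root, keys[i]);
    print_inorder(root);
    printf("\n");
    return 0;
}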
What is the use and importance of studying theory of computation?
I had course on the same subject during graduation, but I did not study it seriously.
I also found the following link where some video lectures are available:
Theory of Computation
Shai Simonson's classes are really very good. I have listened to them. As he says in the initial lecture, 'Theory of Computation' is a study of abstract concepts. But these abstract concepts are really very important to a better understanding of the field of computing, as most of the concepts we deal with have a lot of abstract and logical underpinnings.
As John Saunders said, you can become a programmer, even a good one, if you know the programming language well. But knowledge of what is going on underneath will always make you an enlightened one. So go ahead and learn it again. (NB: I understand why you didn't study it seriously at college. Most of the teachers in our colleges aren't that good at explaining this topic (I too had a lousy teacher), but I assure you the teacher here is the best you can get.)
I think every computer science student should know some computation theory, even if you won't do any research.
Some concepts are just universal, and you will encounter them again and again in other courses. E.g. finite state machines: you need to know them when you are learning string matching algorithms and compilers. Another example: you will learn some reduction algorithms (transforming one model into another) in computation theory; these things teach you how to think abstractly and algorithmically.
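To make that concrete, here is a tiny hand-rolled state machine in C that recognizes whether its input contains the substring "ab"; an illustrative sketch, not tied to any particular course:

#include <stdio.h>

/* States of a small DFA that accepts any string containing "ab". */
enum state { START, SEEN_A, ACCEPT };

int contains_ab(const char *s)
{
    enum state st = START;

    for (; *s != '\0'; s++) {
        switch (st) {
        case START:
            st = (*s == 'a') ? SEEN_A : START;
            break;
        case SEEN_A:
            if (*s == 'b')
                st = ACCEPT;
            else
                st = (*s == 'a') ? SEEN_A : START;
            break;
        case ACCEPT:
            return 1;   /* the accepting state is absorbing */
        }
    }
    return st == ACCEPT;
}

int main(void)
{
    printf("%d\n", contains_ab("xxaxaby"));  /* 1 */
    printf("%d\n", contains_ab("aaaa"));     /* 0 */
    return 0;
}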
The greatest of all human faculties is the power of abstraction. That is what separates us from the animals. The more we exercise this power, the more successful we are in solving problems.
Playing chess may seem a futile pastime to some and never of practical use to any, but it goes a long way to give the player the ability to think ahead every time an important decision is to be made.
Besides, it reveals the elegance and simplicity that is hidden beneath layers of ugly syntax and brain-dead code we sift through every day just to make a living.
In addition to the usefulness of various tools (regular expressions, context free grammars, state machines etc.) in your daily life as a programmer, a good theoretical computer science course will have taught you how to model certain problems in a way that you can tackle effectively.
Solutions that seem clever to people without training in this discipline will seem natural and "the right way" to people who have. I recommend that you pay close attention to what's going on in your course since it will give you a very powerful toolset that will help you as a programmer and as an abstract thinker.
The importance of computation theory will depend on what you do with your life. If you want to be a computer scientist, then it is an important basis for your future studies.
If you just want to be a programmer or software engineer, then you will probably never use the knowledge again.
Theory of computation is sort of a hinge point among computer science, linguistics, and mathematics. If you have intellectual curiosity, then expose yourself to the underlying theory. If you just want to dip lightly into making computers do certain things, you can probably skip it. Me? I loved it. But I also liked topology, so I may not be a typical developer in that respect.
It’s really not without its practical aspects in regards to software engineering.
For example, you may be tempted to parse some programming language as input to your program with regular expressions.
Computer science theory proves why this is a bad idea (most programming language syntaxes are not regular), and it can never be overcome no matter how much you'd like to try.
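A classic instance is checking balanced parentheses: no regular expression can track arbitrary nesting depth, but a few lines of C with a counter can; a small illustrative sketch:

#include <stdio.h>

/* Returns 1 if every '(' is matched by a later ')', 0 otherwise.
   No regular expression can express this for unbounded nesting. */
int balanced(const char *s)
{
    int depth = 0;

    for (; *s != '\0'; s++) {
        if (*s == '(')
            depth++;
        else if (*s == ')' && --depth < 0)
            return 0;
    }
    return depth == 0;
}

int main(void)
{
    printf("%d\n", balanced("((a)(b))"));  /* 1 */
    printf("%d\n", balanced("(()"));       /* 0 */
    return 0;
}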
Other examples may include NP-complete problems, etc.
Basically, computer science theory can teach you many important things with regards to reasoning. But it also describes the fundamental limits to programming and algorithms.
"Know your limits"
Some practical examples:
Before spending a lot of time on a problem, you'll want to know:
If the problem can't be solved.
If there is a "good" (polynomial) solution, as some problems may not have "good" solutions (or at least, not ones we currently know of ;))
(A bit less practical) you'll want to know if a problem is "harder" than another, that is, takes more time/space.
Every computer science/engineering student has to learn the theory of computation, as it plays a vital role. The theory of computation forms the basis for:
Writing efficient algorithms that run in computing devices.
Programming language research and their development.
Efficient compiler design and construction.
Theory of computation studies the basic primitives necessary to handle computable problems. What counts as "computable" is tacitly understood to be von Neumann machine-style processing and distinct from a Lisp machine. (The Church–Turing thesis says these are ultimately equal, but, in practice they create two very different models of computation.)
For example, here's probably a minimal set to implement basic Turing functionality in a von Neumann-style machine:
MOVE data
AND, OR, and NOT transforms
a COMPARE operator
JUMPIFEQUAL function
To get universal Turing functionality, you'll probably also need memory addressing and a call stack.
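As a toy illustration of an instruction set along those lines (entirely hypothetical opcodes, not any real ISA), one could interpret it in a few dozen lines of C:

#include <stdio.h>

/* Hypothetical opcodes mirroring the primitives listed above. */
enum op { MOVE, AND, OR, NOT, COMPARE, JUMPIFEQUAL, HALT };

struct instr {
    enum op op;
    int dst;   /* destination register index            */
    int src;   /* source register index or an immediate */
    int imm;   /* 1 if src is an immediate value        */
};

void run(const struct instr *prog, int *reg)
{
    int pc = 0, equal = 0;

    for (;;) {
        struct instr in = prog[pc++];
        int v = in.imm ? in.src : reg[in.src];

        switch (in.op) {
        case MOVE:        reg[in.dst] = v;            break;
        case AND:         reg[in.dst] &= v;           break;
        case OR:          reg[in.dst] |= v;           break;
        case NOT:         reg[in.dst] = ~reg[in.dst]; break;
        case COMPARE:     equal = (reg[in.dst] == v); break;
        case JUMPIFEQUAL: if (equal) pc = v;          break;
        case HALT:        return;
        }
    }
}

int main(void)
{
    int reg[4] = {0};

    /* r0 = 5; compare r0 with 5; if equal, skip the MOVE that would
       overwrite r0 with 0; then halt. */
    struct instr prog[] = {
        { MOVE,        0, 5, 1 },
        { COMPARE,     0, 5, 1 },
        { JUMPIFEQUAL, 0, 4, 1 },
        { MOVE,        0, 0, 1 },
        { HALT,        0, 0, 0 },
    };

    run(prog, reg);
    printf("r0 = %d\n", reg[0]);   /* prints r0 = 5 */
    return 0;
}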
As Man's mind gets more complex, the target for what counts as "computable" gets higher and higher. There probably is no upper bound.
I've spent (on and off) the past two and a half years learning C, from books like K&R. I soon came to the realization that I found the prose difficult to understand. I read the "Teach Yourself C in 21 Days" book first, but I couldn't even understand it. Now that I have a fair knowledge of how to use the fundamentals of C (this doesn't include pointers/structures, but basic design), I was thinking of reading "Teach Yourself C in 21 Days" again, starting from pointers, so I can start working on programs. I didn't finish K&R because of the difficulty of its prose and some of the exercises. I know a score of people that didn't even use books to learn how to code. So I'm asking: does anyone think it's a good idea that I just read the 21-day book, move forward from there, and read more advanced books like APUE or Expert C Programming, and so on?
Its prose is simple to understand, and I really want to get into some projects, where I'd probably discover certain pitfalls for myself.
Could you give me any advice? I'm not in a hurry, but I'm eager to get things done!
K&R is the best book to learn C from, IMO.
I agree with Peter Norvig: it's not possible to become proficient in anything in 21 days. All you're trying to do is get a feel for the syntax so you can start writing simple programs.
I think you need to start writing some simple programs as soon as possible. Try out what you know and get comfortable with that. Then read a new chapter and try that out.
Writing code brings it to life. Reading books is very dry. I sometimes read a book from start to finish to learn a new language, but that's only because I've learned so many languages by now that I'm not learning the concepts - just the syntax and the odd novel feature.
From my personal experience and IMO, you are doing it backwards. Start with Hello World!
When I am learning a new programming language, I don't just say I "will learn this language because I hear it is good to learn". I need to have a reason to learn the language. Otherwise if I don't have a reason to learn the language I will not learn or retain anything I read about the new programming language.
I learn by doing and that is why I start with doing the "hello world" example.
C is not the most user friendly language and does not have many "easy small programs" that you can implement and do something cool with... and that is why it is hard to get into.
And that is why I think you really need a good motivator to learn C.
Right now your motivator is "I should learn C because people say it is good to learn"... and I don't think that is enough of a motivator to learn C. One suggestion would be to write a GTK GUI application in C; that will teach you a lot about C, and the end product will be something very concrete and cool to point to (hence a good motivator).
The fastest way to learn is to need to know it. If you've got a problem you need to solve, say, you need to calculate an optimal tournament order for your community's soccer teams, then you will be better motivated to learn how to solve the problem. You will end up encountering one difficulty after another as you learn, but they will always be of the form 'my program doesn't do x' rather than 'i don't really get x'. Each exercise of fixing each shortcoming will teach you new things about the language and how to use them. The added benefit of having a clear goal in mind, with many small successes along the way will keep you energized and provide positive feedback to your learning.
The K&R C book will provide you with just the right tools to approach a problem using C idioms.
If you need some generic ideas for problems to solve, try projecteuler.net, or search for related questions here on Stack Overflow.
Gosh, 2.5 years seems like an awfully long time! I used to be an instructor with a commercial training company and our C course, which covered the entire language including "advanced" stuff like function pointers, only took 4 days! Which bits are you finding particularly difficult?
K&R is a great book (maybe it is all you need to learn C); all you need is some patience and hard work.
Use those 21 days by doing the exercises in K&R.
Pick easy ones first. When you get stuck, you can ask Stackoverflow. ;)
The best way for you to learn C is to put in practice the topics you are learning.
For example, if you are going through basic principles like looping and if statements, you can build yourself a little program that simulates counting out change in a vending machine.
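A change-counting sketch needs nothing beyond loops, arrays, and integer division; the coin values and amount here are arbitrary:

#include <stdio.h>

int main(void)
{
    /* Arbitrary coin denominations, in cents, largest first. */
    const int coins[] = {100, 25, 10, 5, 1};
    const int ncoins = (int)(sizeof coins / sizeof coins[0]);
    int amount = 287;   /* change to give out, in cents */

    for (int i = 0; i < ncoins; i++) {
        int count = amount / coins[i];   /* how many of this coin */
        amount %= coins[i];              /* what remains to give  */
        printf("%d x %d cents\n", count, coins[i]);
    }
    return 0;
}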
Once you move on to more advanced topics like pointers and data structures you can build a Logo like program. The user can input each step of the turtle's movement, and you can store that in a linked list. Afterwards you can perform pointer operations on your data structure by deleting, reversing, or adding nodes.
So the idea is to start coding to test what you learned. You will find that some of the time you don't really understand something, even though it seemed clear in the book. As a good programmer, you will end up revisiting the topics until you actually know them.
Teaching yourself C in 21 days is like teaching yourself handgun safety in 2 minutes. The results of either are undefined, except C lacks a safety or decocking mechanism.
In reality, it's going to take the better part of two years to achieve any kind of definition of proficient. You'll gain this by shooting yourself in the foot, repeatedly, while learning from a determination to tackle practical problems using C.
It takes the better part of nine months just to discover blinking; blinking is a very complex process, and C is not always instinctual either.
I think you should use some libraries like GTK, because we learn more when we see how to use other people's code. And doing some UI is far more motivating than writing console apps. It's more fun to manipulate windows, buttons, and checkboxes than ints, floats, and doubles (and not that much more difficult).
First of all, if you have no programming experience at all, you can't learn a language in 21 days. You will need at best several months.
On the other hand, if you have programming experience, my opinion is that the best thing you could do is find a "cookbook", like the ones available here. I think Deitel had something like a C cookbook; you should visit their site and see. After you find a book like that, you have to start writing code, all sorts of code, to cover most of the problems that you could encounter while working on a project: memory management/data structures, input/output, networking, etc.
Good luck!
Being an aspiring Apple developer, I want to get the community's opinion on whether it is better to learn C first before moving on to Objective-C and ultimately the Cocoa framework.
My gut says learn C, which will give me a good foundation.
I would learn C first. I learned C (and did a lot in C) before moving to Obj-C. I have many colleagues who never were real C programmers; they started with Obj-C and learned only as much C as necessary.
Every now and then I see how they solve a problem entirely in Obj-C, sometimes resulting in very clumsy solutions. Usually I then replace some Obj-C code with pure C code (after all, you can mix them as much as you like; the content of an Obj-C method can be entirely pure C code). Without any intention to insult any Obj-C programmer, there are solutions that are very elegant in Obj-C, solutions that just work (and look) a lot better thanks to objects (OOP can make complex programs much more lovely than procedural programming; polymorphism, for example, is a brilliant feature), and I really like Obj-C (much more than C++! I hate the C++ syntax, and some language features are plain overkill and lead to bad development patterns, IMHO). However, when I sometimes rewrite my colleagues' Obj-C code (and I really only do so if I think it is absolutely necessary), the resulting code is usually 50% smaller, needs only 25% of the memory it used before, and is about 400% faster at runtime.
What I'm trying to say here: Every language has its pros and cons. C has pros and cons and so does Obj-C. However, the really great feature of Obj-C (that's why I even like it more than Java) is that you can jump to plain C at will and back again. Why this is such a great feature? Because just like Obj-C fixes many of the cons of pure C, pure C can fix some of the cons of Obj-C. If you mix them together you'll receive a very powerful team.
If you only learn Obj-C and have no idea of C or only know the very basics of it and never tried how elegantly it can solve some common problems, you actually learned only half of Obj-C. C is a fundamental part of Obj-C. The ability to use C at any time and everywhere is a fundamental feature of it.
A typical example was some code we used that had to encode data in base64, but we could not use an external library for that (no OpenSSL lib). We used a base64 encoder written entirely with Cocoa classes. It worked okay, but when we made it encode 200 MB of binary data, it took an eternity and the memory overhead was unacceptable. I replaced it with a tiny, ultra-compact base64 encoder written entirely as one C function (I copied the function body into the method body; the method took NSData as input and returned NSString as output, but inside everything was C). The C encoder was so much more compact that it beat the pure Cocoa encoder by a factor of 8 in speed, and the memory overhead was also much lower. Encoding/decoding data, playing around with bits, and similar low-level tasks are just the strong points of C.
Another example was some UI code that drew a lot of graphs. For storing the data necessary to paint the graphs, we used NSArrays. Actually NSMutableArrays, since the graph was animated. Result: very slow graph animation. We replaced all the NSArrays with normal C arrays, objects with structs (after all, graph coordinate information is nothing you must keep in objects), enumerator access with simple for loops, and started moving data between the arrays with memcpy instead of copying from one array to the other, index by index. The result: a speed-up by a factor of 4. The graph animated smoothly, even on older PPC systems.
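The plain-C version of that idea looks roughly like this; the struct, sizes, and values are hypothetical, just to show the pattern of structs plus memcpy instead of object containers:

#include <stdio.h>
#include <string.h>

/* A plain struct instead of an object for each data point. */
struct point {
    double x;
    double y;
};

#define NPOINTS 1024

int main(void)
{
    struct point current[NPOINTS];
    struct point previous[NPOINTS];

    /* Fill the current frame with some data. */
    for (int i = 0; i < NPOINTS; i++) {
        current[i].x = i;
        current[i].y = i * 0.5;
    }

    /* One memcpy replaces an element-by-element copy between frames. */
    memcpy(previous, current, sizeof current);

    printf("previous[10] = (%.1f, %.1f)\n", previous[10].x, previous[10].y);
    return 0;
}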
The weakness of C is that any more complex program gets ugly in the long run. Keeping C applications readable, extensible, and manageable demands a lot of discipline from a programmer. Many projects fail because that discipline is missing. Obj-C makes it easy for you to structure your application using classes, inheritance, protocols, and so on. That said, I would not use pure C functionality across the borders of a method unless necessary. I prefer to keep all code in an Objective-C app within the methods of objects; everything else defeats the purpose of an OO application. However, within a method I sometimes use pure C exclusively.
You can readily enough learn C and Objective-C at the same time -- there's certainly no need to learn the minutiae of C (including pointer arithmetic and so on) before starting with Objective-C's additions to the language, and as a novice programmer getting underway with Objective-C quickly may help you to start "thinking in objects" more quickly.
In terms of available resources, Apple's documentation does typically assume familiarity with C, so starting with The Objective-C 2.0 Programming Language won't be of much benefit to you. I would invest in a copy of Programming in Objective-C by Stephen Kochan (depending on how quickly you want to get underway, you may consider waiting for the second edition):
Programming Objective-C Developers Library
Programming Objective-C 2.0 Developers Library
It assumes no prior experience, and teaches you Objective-C and as much C as you need.
If you're feeling a little ambitious, you might start with Scott Stevenson's "Learn C" Tutorial, but it does have some prerequisites ("You should already know at least one scripting or programming language, including functions, variables and loops. You'll also need to type commands into the Mac OS X Terminal.").
(Just for the record and for context: I learned both at the same time back in 1991 -- it didn't seem to do me any harm. I did, though, have a background in BASIC, Pascal, Logo, and LISP.)
I thought a lot about this issue before writing my book on Objective-C. First, I really believe that learning the C language before learning Objective-C is the wrong path. C is a procedural language containing many features that are not necessary for programming in Objective-C, especially at the novice level. In fact, resorting to some of these features goes against the grain of adhering to a good object-oriented programming methodology. It’s also not a good idea to teach all the details of a procedural language (and attacking a problem's solution with functions and structured programming techniques) before learning an object-oriented one. This can start the programmer off in the wrong direction potentially leading to developing the wrong orientation and mindset for fostering a good object-oriented programming discipline. Just because Objective-C is an extension to the C language doesn’t mean you have to learn C first!
I think that teaching Objective-C and the underlying C language as a single integrated language is the right approach. There's no reason to learn that a "for" statement is from the C language and not from its superset Objective-C language. Further, why learn in detail about things like C arrays and strings (and manipulating them) before learning about array (NSArray) and string objects (NSString), for example? Many C texts devote a lot of time to structures, and pointers to structures, and iterating through arrays with pointers. But you can start writing Objective-C programs without knowing about any of these C language features. And for a novice programmer, that's a big deal. Not only does that shorten the learning curve but also reduces the amount of material that has to be learned (and some of it selectively filtered out) to writing Objective-C programs.
I do agree that you will want to learn most, if not all, of the underlying C features, but many can be deferred until a solid grasp of defining classes and methods, working with objects and message expressions, and understanding the concepts of inheritance and polymorphism are well-understood.
I'd dive right in with Objective C - if you've already got a few languages under your belt, it's not the syntax which is the learning curve, it's Cocoa.
I think that, for the most part, learning C is a good idea no matter what arena you're going into, at least to get the hang of the inner workings of software development before using prepackaged goods; that way, if something goes wrong, you have a better chance of understanding what's happening underneath. There's plenty of discussion about this on SO, and it's a rather subjective question, but in general you will inherently be using C within your Objective-C code, so I guess it's really up to you. I'm a ground-up kind of person, but sometimes it can get in the way, and I know several smart people who worked their way from the top down. I think the important part is that you come to understand the inner workings, as that will set your capabilities apart from those who don't, as well as increase your capabilities.
It's a good idea to learn C before learning Objective-C, which is a strict superset of C. This means that Objective-C can support all normal C code, so the code common to C programs is bound to show up even in Objective-C code.
In addition to looking at things purely from a language point of view, you will find that Mac OS X is a complete Unix operating system. All the system level libraries are written in C.
It is probably possible to learn both at the same time, but I think you will appreciate and understand Objective-C more if you have a solid working knowledge of C first.
I'd learn Objective-C and learn as much C as you need as you go along.
The areas of C that you won't depend on much:
Pointer arithmetic and arrays. I haven't used C arrays at all.
C strings. Objective-C's strings do the job nicer and safer.
Manual memory management if you use GC in Obj-C 2.1. I highly recommend this route for development speed and performance reasons.
As you learn Objective-C and Cocoa, you cannot avoid learning bits of C. For example, rectangles are commonly represented by CGRect, a C struct.
If you have time, by all means learn C. As others have said here, Kochan's book (second and first editions) is excellent as a book to dip into.
There are a lot of things you can't do purely in Objective-C, so learning some basic C skills will be pretty critical. You'll at least need to understand variable declarations and the basic C library functions, or you'll be frustrated.
Honestly, so many languages are based on the C syntax that it's a good thing to be familiar with. I'd take a week or two to familiarize yourself with C regardless.
That said, I did just teach myself Objective C, and I have to be honest: I didn't find my C experience to be as useful as I would have thought. Objective C was definitely eye-opening for me.
You can jump directly into Objective-C, with the following benefits:
You'll learn "some" C along the way.
You'll learn the C parts that are relevant for you.
At least for me, it is easier to learn a new language when I'm interested in some specific app or sample, and I fail when I have to learn something that is not exactly what I'm interested in.
You can always refine your C knowledge later if you get interested in lower level programming.
Whether it's better, I don't know, even less so as I am not familiar with Objective-C.
But the basics of C aren't so hard to learn; it isn't a very complex language (in terms of syntax, not in terms of mastering it!), so go for it, it won't be wasted time.
Personally, I think it is always a good idea to learn C, it gives a good insight of how computer works. After all, most languages and systems are still written in C. Then move on! :-)
PS.: By "move on", I didn't mean "drop it", just "learn more, learn different". Once you know C, you might never drop it: Java uses JNI to call C routines for low level stuff, Python, Lua, etc. are often extended with C code (Lua reference even just assumes some C knowledge for some functions which are just a thin wrapper to the C function behind), and so on.
Yes, learning the C language before any other advanced languages will help you learn those other languages quickly.
According to Wikipedia, Objective-C is a strict super-set of C. This being the case, I would suggest learning C first. Then when you learn Objective-C it will be clear what parts are added as part of Objective-C.
C gives you very little abstraction from assembly. Some C Compilers will even let you inline assembly. This can be very useful for thinking about how the computer works, which is important to know.
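For example, GCC and Clang accept extended inline assembly like the following (x86-64 specific, and purely illustrative; real code should rarely need it):

#include <stdio.h>

int main(void)
{
    long a = 2, b = 3, sum;

    /* Add two values with a single x86-64 instruction. */
    __asm__ ("addq %1, %0"
             : "=r"(sum)           /* output operand                  */
             : "r"(b), "0"(a));    /* inputs; "0" reuses the output   */

    printf("%ld\n", sum);   /* prints 5 */
    return 0;
}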
That being said, if you're really interested in Objective-C, don't let yourself get stuck writing something in C just because it's "good for you". You don't need to frustrate yourself while you're trying to learn a new skill set. It is important that you have fun with what you're doing.
Do you want to be a hard-core developer? Then learn C first.
The books you need to completely master C are some of the best writings in technology. Here's what you need:
C Programming Language
The Standard C Library
Objective C is sufficiently different from C as to not merit learning C first.
From a syntax / language-family perspective, one is almost better off studying Smalltalk (on which Objective-C is based).
From a practical perspective, focus your efforts on learning one language at a time.
Later, if you wish to learn another language, C++, Java and Python are 1) easy to learn as a bunch 2) popular and thus marketable 3) powerful.
You should have a basic knowledge of C before starting Objective-C, but there's no need to master every detail of C.
I've published my notes after reading "Programming in Objective-C" in case it helps someone else.
Depending on how many languages you already know, it may be a better idea to just start learning Objective-C. The foundations of most languages are basically the same; it's the syntax that is different. Learning C first isn't really going to make much of a difference when it comes to learning Objective-C.
I learned Objective-C straight away, and it has worked fine for about a year now. I just had some difficulty reading C code when I downloaded projects to see how they work, but now I really feel the need to learn C. You can try learning Obj-C without C, but sooner or later you will need C.
IMHO one should first learn at least some C and especially about pointers. That’s even more important if one comes from a language that doesn’t have pointers. A lot of people ask about code like
NSString *string = [[NSString alloc] init];
string = @"something";
since they don’t know about the distinction between a pointer and the object it points to.
Of course one doesn’t have to learn all of C before one can start with Objective-C, but some fundamental things are absolutely necessary.
Heck no, go straight to objective C!
I moved from ActionScript 3 to Objective C, and I already have an intern at a company!
Do what you want.
If you learn some other language first, then you will always have some confusion writing the right syntax. I don't know the purpose, but Objective-C uses a weird (not common) syntax for calling object methods. It calls this sending messages; yes, that is true according to the pure object-oriented concept, but most object-oriented languages call it calling a method and use the more traditional syntax for doing so. Garbage collection is also something very odd; Objective-C is based on old-school reference counting. So you will have difficulties accepting it if you switch from another language.
I am writing a book, an Objective-C quick migration guide for C/C++ programmers, hoping to help people pick up all the differences more quickly.
This is a quick, slightly subjective question I need to ask. In order to become a proficient C programmer, I felt I'd learn C from K&R. I find the book a little easygoing, difficult to understand sometimes, but easygoing on the whole.
My question here is: do I absolutely have to do all the exercises (even those that stumped me) to become a proficient programmer in C? Or can I skip most of them? The format and layout of the questions asked are... difficult, at best, without using the tools available in C's rich set of libraries.
First, best of luck with your learning of C.
What to do to become a proficient programmer per se is a very crude question. I will answer it with an analogy. You may complete all the exercises at the end of the book but fail to complete the first practical program you are assigned to work on. In another case, you might have failed to complete any exercise and yet be able to complete your first assignment. Who do you think is in a better place? I would leave that to interpretation.
Exercises at the end of the book are meant to make a person going through the text familiar with the nuances of problems, code situations, and programming techniques. They are generally meant to test the practical implementation of the text you might have just read. These are problems which give you an overview of what usually comes up in daily practice. As with any exam, if you are unable to solve one particular problem, it does not mean that you don't know anything.
My suggestion would be to try all the problems. Mark down which stumped you and revisit them after some time when you have a better grasp of the topic, may be after you solved some more problems, or went through another good resource.
Try and read more on the topic using the Internet or elsewhere.
As for the book, any book that makes you understand is good; if it fails to do that, it's not worth it for you.
Have fun and Keep Walking !!!
You need to do all the exercises. Then you need to spend 10 years suffering the pain of C. Then you get to be initiated into the conclave with yak blood.
But, seriously:
You learn by doing. Whether you prefer to do by completing all or some of those exercises is irrelevant. Myself, I would do as many as I could, then choose a project I'd enjoy.
For me, that would be a text-mode share portfolio manager, you're likely to have other interests.
But you'll learn faster if you're using it for something you enjoy - that's unlikely to be anything to do with the exercises in K&R.
I think K&R is a very good resource to start learning C. I think you should try to get at least 80% of the questions done on your own; even though a lot of the exercises ask you to rewrite library functions, it helps you to start thinking in "C".
I'm not sure K&R is the correct resource to learn modern C, especially since C has come quite a long way since the K&R days.
Anyway: when it comes to exercises, you should especially do those exercises that stumped you. Your goal is to learn things, not to get through the book the fastest way possible.
I learned C from K&R (first ed, a long time ago) and never did any of the exercises. I don't claim that this made me the brilliant programmer I am today, but it doesn't seem to have done me any harm either :-)
Even a proficient programmer doesn't know everything about the language; you should try and do the exercises that you find difficult - if you try something challenging you'll learn something along the way.