What sort of businesses still hire C programmers? [closed]

I'm starting a job search, ideally ending up at a C shop. So far, I'm coming up empty in my local ads, and am starting to think I need to broaden my search, targeting specific types of businesses.
So, what type of places typically use this language?

C is typically used for fairly low-level development. You'll frequently see it in embedded systems, where positions are often listed under computer engineering (rather than computer science or software engineering). C is also used heavily for device drivers and 'generic' low-level code such as math utility code for larger projects.
Generally, the sorts of jobs that truly need C are held by developers who've been using it forever and have likely been in the position a long time.
Just keep looking! As you've seen, jobs listed simply as "C Developer" are a rarity, so they will naturally be hard to find.
But I'd wonder why you're looking exclusively for a C job, as opposed to a language like C++ or Objective-C :)
Edit:
Just a little note, so the answer doesn't mislead you: C is still used for a lot of different things. Browsers, instant messengers, server daemons, even the networking code of programs written in other languages. The problem is that writing such things in C is inefficient in terms of development time when you could easily write them in Python, on .NET, or with any number of other technologies. As such, this work isn't common, but it does exist.

I work primarily as a C (and Perl) developer because the application is mature, with a fairly long history (it was originally developed in the early 90s). The application suite was originally developed for Unix-based graphical workstations. My previous job was a similar situation: a mature distributed application developed on multiple Unix platforms, also dating from the early 1990s, and given the size and maturity of the source code, it would be difficult to justify throwing that code base away to move to a new language, or even migrating to C++.
I would imagine there are still a number of larger in-house applications written in C (used for internal purposes, not sold as a product) that are still being maintained; not entirely unlike the massive COBOL applications at large companies (insurance, finance, banking) that are also still being maintained.
For new development in C, others have already mentioned the embedded systems market, where the software is often put into ROM or EEPROM / flash memory and referred to as firmware, targeting microcontrollers (Microchip PIC, Atmel AVR, 80C51, 68HC11, etc.). Here object code size, RAM usage, and performance matter, so a programming language with fewer high-level or generic abstractions and assumptions is desirable.
One critical thing about good-to-great C programmers is that they are expected, if not required, to know more about data structures and algorithms. Priority queues, binary trees, merge sort, quicksort, Knuth-Morris-Pratt, and Rabin-Karp should be at least vaguely familiar. This is because C lacks the STL, Boost, CPAN, and the other standard libraries available in other languages, at least partly because, without templates, dynamic typing, or a similar mechanism, most C implementations end up too type-specific to serve as generic, broadly reusable routines in practice.
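To make that concrete, here is a minimal sketch (mine, not from the answer above) of how C fakes genericity with void pointers and a comparison callback, using the standard library's qsort:

    #include <stdio.h>
    #include <stdlib.h>

    /* qsort hands the comparator untyped pointers; the caller supplies
       the type knowledge that templates would otherwise carry */
    static int cmp_int(const void *a, const void *b)
    {
        int x = *(const int *)a;
        int y = *(const int *)b;
        return (x > y) - (x < y);   /* avoids overflow of x - y */
    }

    int main(void)
    {
        int v[] = {42, 7, 19, 3, 23};
        size_t n = sizeof v / sizeof v[0];
        qsort(v, n, sizeof v[0], cmp_int);
        for (size_t i = 0; i < n; i++)
            printf("%d ", v[i]);
        putchar('\n');
        return 0;
    }

The void-pointer approach works, but every element type needs its own comparator, which is exactly the type-specificity described above.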
Knowing more than one programming language is not a bad thing, even if you don't feel comfortable enough to claim you can program in the additional languages professionally. A "modern" scripting or "trendy" web development language might be a good match. Perl, Python, and Ruby are good potential candidates.
For programming experience, functional languages like Lisp, Scheme, Prolog, ML, Objective Caml, Haskell, and Scala are good candidates for making you "think different." Admittedly, Prolog is actually a declarative logic programming language, but it still expands your programming experience.

To add on to Anthony's excellent answer, C is still used extensively in the development of operating systems and firmware, so you may want to try looking in that direction as well.
Good luck in your search for a job.

Things that must run close to the metal, and be fast.
So in addition to what Anthony wrote -- networking protocols, storage device drivers, file systems, the core of operating systems, are still big on C.

Because the industry's focus has largely moved to applied and web development, where you can't do much with C.
Either extend your search geography to other cities/countries or follow the industry trend and learn something new.

Most C programming jobs are in "embedded systems": things like televisions, cars, phones, alarms, clocks, and toys. Such applications are often memory-constrained for cost reasons, so higher-level languages (e.g., Python) are not an option there.
At a time when C and C++ were the predominant coding environments, it was said that 90% of the C programming jobs were for embedded work: stuff that isn't advertised as software, and which rarely has any famous names or faces associated with it. This is even more the case today.

Linux (the kernel) is written almost entirely in C, so any company that contributes to Linux is likely to employ C coders. I worked for an industrial automation company that developed in C, though most automation shops run PLCs and ladder logic.

Look for iPhone development shops online; try Craigslist as well.
Objective-C is a slim superset of C, so your C skills translate nicely.
Good luck!

Related

Which programming languages have the most publicly available libraries?

I had assumed that there would be an easy-to-find list of programming languages sorted by the number of libraries they have, but I have been unable to find any such thing. Is there any way to find or make such a list? Specifically, what would be the top 5? I understand that this would require defining what is and isn't a library, but I was shocked to see that I couldn't find even an attempt at such a list.
Check the website http://www.modulecounts.com/
The top three are:
JavaScript
Java
Python
Please note: having more libraries does not mean a language is better.
It strongly depends on the use.
I rank programming languages by their efficiency to solve problems in either a general case (otherwise unspecified) or in a specific domain such as:
Robotics
Mobile computers
Mathematics and Science
Web development
Embedded systems
Systems programming
Network and infrastructure programming
Database programming
Consider the following. According to IEEE, the popularity of a programming language is measured by quantifying these indicators:
Google Search, number of hits for the programming language
Google Trends, past year to date trend for programming language
Chatter on Twitter about the programming language
New and active repos on GitHub for a particular language
Stack Overflow
Hacker News
Demand on CareerBuilder and the IEEE job board
The result is a ranking in which R and JavaScript/TypeScript land on the same list. However, they are primarily used in different domains. This does NOT mean that R cannot be used to write a solution that would typically be written in JavaScript; it simply means it would be the wrong tool for the job.
A little more detail on this.
General Purpose:
Python ~ Easy to use, easy to learn, known for data science.
Java ~ Popular for enterprises and client/server topologies since the introduction of the JVM (run the same code on any architecture).
C++ ~ Fast. I mean REALLY fast! Game engines are built with it.
Embedded Systems and Programs:
C ~ low memory usage, small file size. Every microcontroller in every car built after 2011 probably has a C program running on it.
Assembly ~ but not so much, since it requires low-level microprocessor knowledge.
Rust ~ very promising because of its memory-safety guarantees.
Web:
JavaScript ~ every dynamic website has JS in it
HTML ~ I don't understand how this was missed
IaaS:
Golang ~ nothing even comes close to Go
It is important to understand that the problems within a domain are what necessitate the development of a new programming language library, if one does not already exist, and not the other way around.
One can also argue that the continual improvement of technological standards leads to new and improved libraries at a faster pace.
This is a bit of a difficult question, as it depends on what you count as a publicly available library. One way is to check the most-used languages on GitHub. Checking The 2020 State of the Octoverse, the top three are:
JavaScript
Python
Java
I'm not sure there is any plausible, comprehensive, and credible statistical data on which languages have the most publicly available libraries that covers even a substantial number of programming languages (never mind all of them).
Therefore, your question will most likely lead to opinion-based answers, and you should avoid asking such questions on Stack Overflow.
The only reliable resources that come to my mind, and that will give you data about the popularity of programming languages (not publicly available APIs), are TIOBE and RedMonk.

How do I practice Unix programming in C?

After five years of professional Java (and to a lesser extent, Python) programming and slowly feeling my computer science education slip away, I decided I wanted to broaden my horizons / general usefulness to the world and do something that feels more (to me) like I really have an influence over the machine. I chose to learn C and Unix programming since I feel like that is where many of the most interesting problems are.
My end goal is to be able to do this professionally, if for no other reason than the fact that I have to spend 40-50 hours per week on work that pays the bills, so it may as well also be the type of coding I want to get better at. Of course, you don't get hired to do things you haven't done before, so for now I am ramping up on my own.
To this end, I started with K&R, which was a great resource in part due to the exercises spread throughout each chapter. After that I moved on to Computer Systems: A Programmer's Perspective, followed by ten chapters of Advanced Programming in the Unix Environment. When I am done with this book, I will read Unix Network Programming.
What I find missing in the Stevens books is programming problems; they mainly document functionality and provide examples, with a few end-of-chapter questions. I feel I would benefit much more from being challenged to use the knowledge in each chapter, à la K&R. I could write some test program for each function, but that is less desirable since (1) I would probably be less motivated than if I were rising to some external challenge, and (2) I will naturally only think to use each function in the ways that have already occurred to me.
So, I'd like to get some recommendations on how to practice. Obviously, my first choice would be to find some resource that has Unix programming challenges. I have also considered finding and attempting to contribute to some open source C project, but this is a bit daunting as there would be some overhead in learning to use the software, then learning the codebase. The only open-source C project I can think of that I use regularly is Python, and I'm not sure how easy that would be to get started on.
That said, I'm open to all kinds of suggestions as there are likely things I haven't even thought of.
Reinvent a lot of the core Unix utilities. Most of these were (and still are) written in C, so they are a good way to start off learning. Depending on your skill, pick harder or easier utilities to copy.
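As a hedged illustration of the scale involved, here is roughly the core of cat(1) in raw read(2)/write(2) calls; option parsing and careful error reporting are left as the actual exercise:

    #include <unistd.h>

    int main(void)
    {
        char buf[4096];
        ssize_t n;

        /* copy stdin to stdout a block at a time */
        while ((n = read(STDIN_FILENO, buf, sizeof buf)) > 0)
            if (write(STDOUT_FILENO, buf, n) != n)
                return 1;   /* short write: give up */
        return n < 0;       /* nonzero exit if the read failed */
    }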
Try writing your own malloc. You'll learn a lot about Unix and a lot of C programming as well.
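If you take that suggestion, a toy bump allocator is a common first step. The sketch below is an assumption about how you might start, not a real malloc: it grabs memory from the kernel with sbrk (available on traditional Unix systems) and never frees it.

    #include <stddef.h>
    #include <unistd.h>

    /* toy allocator: 16-byte aligned, no free(), no thread safety */
    void *my_malloc(size_t size)
    {
        size = (size + 15) & ~(size_t)15;   /* round up for alignment */
        void *p = sbrk(size);               /* extend the program break */
        return p == (void *)-1 ? NULL : p;
    }

A real implementation adds block headers, free lists, and splitting/coalescing, which is where most of the learning happens.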
Google for computer science operating system courses and do the projects there. Many schools have these projects on public websites so you could get everything you need. Here is a link to Purdue's site. Give the shell project a shot; it was difficult, but really educational.
Here are a few Stack Overflow postings discussing C/Unix programming books. Their main claim to fame is extensive fan-out to other resources.
Practice some of the idioms (understand the ins and outs of pointers etc) and pick projects that help with that. The third item in the list has an answer (disclaimer, I wrote it) with a bunch of high-level C idioms where an idiomatic C program would differ from Java.
Learning Xlib or Win32 GUI programming with C is probably less useful, as almost anything is better than C for GUI programming, and wrapping a core C engine with another language is generally much easier anyway, unless you're really concerned with speed. As a starting point, concentrate on 'systems' programming applications, which is where you are most likely to get real mileage from C. Learn how the C interface of one of the scripting languages like Tcl or Python works (a minimal sketch follows).
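For that last suggestion, a minimal sketch of Python's C interface looks like the following; the module and function names are my invention for illustration. Built as a shared library (e.g. cc -shared -fPIC $(python3-config --includes) spam.c -o spam.so), Python can import spam and call spam.add(2, 3):

    #include <Python.h>

    /* unpack two Python ints, add them, box the result */
    static PyObject *spam_add(PyObject *self, PyObject *args)
    {
        long a, b;
        if (!PyArg_ParseTuple(args, "ll", &a, &b))
            return NULL;    /* argument error is already set */
        return PyLong_FromLong(a + b);
    }

    static PyMethodDef SpamMethods[] = {
        {"add", spam_add, METH_VARARGS, "Add two integers."},
        {NULL, NULL, 0, NULL}
    };

    static struct PyModuleDef spammodule = {
        PyModuleDef_HEAD_INIT, "spam", NULL, -1, SpamMethods
    };

    PyMODINIT_FUNC PyInit_spam(void)
    {
        return PyModule_Create(&spammodule);
    }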
Writing some bit-twiddly and pointer-heavy code (e.g., picking apart network packets or interpreting a protocol), or server-side programs using sockets, will take you right into C's core competencies. If you've got some of the W. Richard Stevens books, try making pthreads work (IIRC, UNP volume 2 has a big section about pthreads). For a contrast, write something that uses non-blocking I/O. This stuff is C's home turf and you'll get a good working knowledge of C by doing it. Trawl through the RFCs for ideas of network protocols to implement.
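For the non-blocking suggestion, the idiomatic first step is flipping a descriptor into non-blocking mode with fcntl; a small sketch:

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/socket.h>
    #include <unistd.h>

    /* preserve the existing flags, add O_NONBLOCK */
    static int set_nonblocking(int fd)
    {
        int flags = fcntl(fd, F_GETFL, 0);
        if (flags == -1)
            return -1;
        return fcntl(fd, F_SETFL, flags | O_NONBLOCK);
    }

    int main(void)
    {
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        if (fd == -1 || set_nonblocking(fd) == -1) {
            perror("socket/fcntl");
            return 1;
        }
        /* reads on fd now fail fast with errno == EAGAIN instead of
           blocking; pair this with select/poll/epoll to learn the
           event-driven style */
        close(fd);
        return 0;
    }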
What are good Linux/Unix books for an advancing user?
What are some good resources for learning C beyond K&R
Resources for learning C program design
What’s a good way to start learning about Data Structures & Algorithms?
Algorithms in C
Are you open to book suggestions? Although it is a bit dated (where "a bit" perhaps is a huge understatement), Maurice Bach's "The Design of the Unix Operating System" is a great book. It belongs next to K&R on your bookshelf.
You might try working your way through all the examples in the book Software Tools (Amazon). Much of it is pretty pedestrian (right-justify text, de-tabify, etc.), but it's a great introduction to the Unix philosophy and basic C programming.
This is pretty interesting. Since Unix is written in C, the language may not be difficult to learn. If you are using Windows, you can use Windows Services for UNIX to practice your programs. If you are using Unix, you can use an editor such as vi.
I would recommend one thing highly.
Try to re-write the usual Linux command-line tools like ls, mkdir, cd, etc.
You'll gain a lot of knowledge about programming and Linux, both at the same time.
Pick the commands starting from the easiest, say time, and work all the way up to the more complicated ones.
The best way to consolidate your learning is to practice. So just choose a kind of application that interests you and start developing it (for example, a simple network client/server application).
Try to exercise most of the Unix APIs (files, sockets, etc.) to see how they work. You could, for example, get a Unix programming book, follow its chapters, and test everything you read on your application by creating your own functions. That way, you can start to develop your own function library to be used in further projects.
Write a webserver.
Make it multi-threaded.
Have it support a new scripting language you develop (a la PHP, etc.)
Allow uploads from authenticated users.
Write a plugin for your favorite tool (e.g., integrate with SVN to give a web view).

We have to use C "for performance reasons" [closed]

In this age of many languages, there seems to be a great language for just about every task and I find myself professionally struggling against a mantra of "nothing but C is fast", where fast is really intended to mean "fast enough". I work with very rational open-minded people, who like to compare numbers, and all I have are thoughts and opinions. Could you help me find my way past subjective opinions and into the "real world"?
Would you help me find research on which other languages, if any, could be used for embedded and (Linux) systems programming? I may well be pushing a false hypothesis and would greatly appreciate research showing me that. Please link to or include good numbers to help keep the "that's just his/her opinion" comments to a minimum.
So, these are my particular requirements:
memory is not a serious constraint
portability is not a serious concern
this is not a real time system
In my experience, using C for embedded and systems programming isn't necessarily a performance issue - it's often a portability issue. C tends to be the most portable, well supported language on just about every platform, especially on embedded systems platforms.
If you wish to use something else in an embedded system, it's often a matter of figuring out what options are available, then determining whether the performance, memory consumption, library support, etc, are "good enough" for your situation.
"Nothing but C is fast [enough]" is an early optimisation and wrong for all the reasons that early optimisations are wrong. If your system has enough complexity that something other than C is desirable, then there will be parts of the system that must be "fast enough" and parts with lighter constraints. If writing your code, for example, in Python will get the project finished faster, with fewer bugs, then you can follow up with some C or assembly code to speed up the time-critical parts.
Even if it turns out that the entire code must be written in C or assembly to meet the performance requirements, prototyping in a language like Python can have real benefits. You can take your working Python prototype and gradually replace parts with C code until you reach the necessary performance.
So, use the tools that let you get the development work done most correctly and most quickly, then use real data to determine where you need to optimize. It could be that C is the most appropriate tool to start with sometimes, but certainly not always, even in embedded systems.
There are some very good reasons to use C for embedded systems, of which "performance" is only a minor one. Embedded work is very close to the hardware, and you need manual memory addressing to communicate with it. The available APIs and SDKs are mostly for C.
There are only a few platforms that can run a VM for Java or Mono, which is partially due to the performance implications but also due to the high implementation costs.
Apart from performance, there is another consideration: you'll most likely be dealing with low-level APIs that were designed to be used in C or C++.
If you cannot use some SDK, you'll only get yourself in trouble instead of saving time with developing using a higher level language. At the very least, you'll end up redoing a bunch of function declarations and constant definitions.
For C:
C is often the only language that is supported by the compilers for a given processor.
Most of the libraries and example code are probably also in C.
Most embedded developers have years of C experience but very little experience in anything else.
Allows direct hardware interfacing and manual memory management (see the sketch after this list).
Easy integration with assembly language.
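A quick sketch of what the "direct hardware interfacing" point looks like in practice; the register address and bit position below are hypothetical, not taken from any real chip's datasheet:

    #include <stdint.h>

    /* a memory-mapped GPIO output register at a made-up address */
    #define GPIO_OUT (*(volatile uint32_t *)0x40020014u)
    #define LED_PIN  (1u << 5)

    void led_on(void)  { GPIO_OUT |=  LED_PIN; }
    void led_off(void) { GPIO_OUT &= ~LED_PIN; }

The volatile qualifier is what tells the compiler that each access really must hit the hardware; few higher-level languages give you this directly.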
C is going to be around for many years to come. In embedded development it's a monopoly that smothers any attempt at change. A language that needs a VM, like Java or Lua, is never going to go mainstream in the embedded environment. A compiled language might stand a chance if it provides compelling new features over C.
There are several benchmarks on the web comparing different languages. In most of them you will find a C or C++ implementation at the top, as those languages give you more control to really optimize things.
Example: The Computer Language Benchmarks Game.
It's hard to argue against C (or other procedural languages like Pascal, Modula-2, or Ada) and assembly for embedded work. There is a long history of success with those languages. Generally, you want to remove the risk of the unknown, and trying to use anything other than C or assembly is, in my opinion, an unknown. Having said that, there's nothing wrong with a mixed model where you use one of the Schemes that compile to C, or Python, Lua, or JavaScript as a scripting language.
What you want is the ability to quickly and easily go to C when you have to.
If you convince the team to go with something that is unproven to them, the project is your cookie. If it crumbles, it'll likely be seen as your fault.
This article (by Michael Barr) talks about the use of C, C++, assembler and other languages in embedded systems, and includes a graph showing the relative usage of each.
And here's another article, fittingly entitled, Poor reasons for rejecting C++.
Ada is a high-level programming language that was designed for embedded and mission-critical systems.
It is a fast, secure language with data checking built in everywhere. It is what airplane autopilots are programmed in.
At this link you have a comparison between Ada and C.
There are situations where you need real-time performance, especially in embedded systems. You also have severe memory constraints. A language like C gives you greater control over execution time and execution space.
So, depending on what you are doing, C may very well be "better" or more appropriate.
Check out the following articles
http://theunixgeek.blogspot.com/2008/09/c-vs-python-speed.html
http://wiki.python.org/moin/PythonSpeed/PerformanceTips (especially see Python is not C section)
http://scienceblogs.com/goodmath/2006/11/the_c_is_efficient_language_fa.php
C is ubiquitous, available for almost any architecture, usually from day one of a processor's availability. C++ is a close second. If your system can support C++ and you have the necessary expertise, use it in preference to C: it is all that C is and more, so there are few reasons for not using it.
C++ is a larger language, and some of its constructs and techniques may consume resources or behave in unacceptable ways in an embedded system; but that is a reason to learn to use the language appropriately, not a reason to avoid it.
Java and C# (on the .NET Micro Framework or WinCE) may be viable alternatives for non-real-time work.
You may want to look at the D programming language. It could use some performance tuning, as there are some areas where Python can outperform it. I can't really point you to benchmarking comparisons since I haven't been keeping a list, but as pointed out by Peter Olsson, Benchmarks & Language Implementations has Digital Mars D.
You will probably also want to look at these lovely questions:
Getting Embedded with D (the programming language)
How would you approach using D in an embedded real-time environment?
I'm not really a systems/embedded programmer, but it seems to me that embedded programs generally need deterministic performance. That immediately rules out many garbage-collected languages, because they are not deterministic in general. However, there has been work on deterministic garbage collection (for example, Metronome for Java: http://www.ibm.com/developerworks/java/library/j-rtj4/index.html).
The issue is one of constraints: do the languages/runtimes meet the determinism, memory usage, and other requirements?
C really is your best choice.
There is a difference between writing portable C code and getting too deep into the gee-whiz features of a specific compiler or the corner cases of the language (all of which should be avoided). Aim for portability across compilers and compiler versions, and consider the number of employees who will be capable of developing or maintaining the code. The compilers are going to have an easier time with such code and produce better, cleaner, and more reliable output.
C is not going anywhere, with all the new languages being designed to fix the flaws in all the prior languages. C, with all the flaws these new languages are trying to fix, still stands strong.
Here are a couple articles that compare C# to C++ :
http://systematicgaming.wordpress.com/2009/01/03/performance-c-vs-c/
http://journal.stuffwithstuff.com/2009/01/03/debunking-c-vs-c-performance/
Not exactly what you asked for, as it doesn't focus on embedded C programming, but it's interesting nonetheless. The first one demonstrates the performance of C++ and the benefits of using "unsafe" code for processor-intensive tasks. The second one somewhat debunks the first and shows that if you write the C# code a little differently, the performance is almost the same.
So I will say that C or C++ can be the clear winner in terms of performance in many cases, but oftentimes the margin is slim. Whether to use C or not is another topic altogether. In my opinion it really should depend on the task at hand, but in embedded systems you often don't have much of a choice.
A couple of people have mentioned Lua. People I know who have worked with embedded systems have said Lua is useful, but it's less a standalone language than a library that can be embedded in C: it is targeted at use within embedded systems, and generally you'll call Lua code from C. But pure C makes for simpler (though not necessarily easier) maintenance, since everyone knows it.
Depending on the embedded platform, if memory constraints are an issue, you'll most likely need to use a non-garbage collected programming language.
C in this respect is likely the most well-known by the team and the most widely supported with available libraries and tools.
The truth is: not always.
The .NET runtime (though any other runtime can serve as an example) imposes several MB of overhead. If that is all the RAM you have, you are out of luck. Java ME seems more compact, but it still all depends on the resources you have at your disposal.
C compilers are much faster even on desktop systems, because of how few language features there are compared to C++, so I'd imagine the difference is non-trivial on embedded systems. This translates to faster iteration times, although OTOH you don't have the conveniences of C++ (such as collections), which may slow you down in the long run.

Why is C used for writing drivers and OS codes? [closed]

Why is C used for writing drivers and OS code?
Is there a size problem?
Are there any drivers written in other languages?
Which languages were XP, Vista, and Solaris written in?
C compiles down to machine code and doesn't require any runtime support for the language itself. This means that it's possible to write code that can run before things like filesystems, virtual memory, processes, and anything else but registers and RAM exist.
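A hedged sketch of what "no runtime support" means in practice: the program below is all there is, with no libc and no startup files. On x86-64 Linux it can be built with gcc -nostdlib -static, and it talks to the kernel directly through the syscall instruction (1 and 60 are the write and exit syscall numbers in that ABI):

    /* the linker's default entry point when there is no C runtime */
    void _start(void)
    {
        static const char msg[] = "no runtime needed\n";
        long nr = 1;                    /* syscall number: write */

        /* write(1, msg, len) */
        __asm__ volatile ("syscall"
                          : "+a"(nr)
                          : "D"(1L), "S"(msg), "d"(sizeof msg - 1)
                          : "rcx", "r11", "memory");

        /* exit(0): there is nothing to return to, so we must not fall off */
        __asm__ volatile ("syscall" : : "a"(60L), "D"(0L));
        __builtin_unreachable();
    }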
In safety-critical environments (think avionics, spacecraft, medical devices, transportation, control software for process control), systems (as well as drivers) are often written using Ada or even SPARK/Ada.
To clarify: C is usually understood to be fairly low-level, pretty much a "macro language" for assembly itself, which is also where its power comes from (speed, size, portability).
Ada, on the other hand, has been specifically designed for safety-critical applications with verifiability in mind. To quote Ada 2005 for Mission-Critical Systems:
Ada [9] is the language of choice for many critical systems due to its careful design and the existence of clear guidelines for building high integrity systems [10]
That's also where Ada's support for strong typing comes in, as well as a number of other important features (quoting Design for Safety):
Programming languages differ wildly in their appropriateness for use in safety-related systems. Carré et al. identified six factors that influence the suitability of a language for high-integrity applications [Carré 1990]. These are:
Logical soundness
Complexity of definition
Expressive power
Security
Verifiability
Bounded time and space constraints
No standard programming language performs well in all these areas, although some (such as Pascal and Ada) perform much better than languages such as C or C++. In highly critical applications ‘verifiability’ is of great importance. Certain languages allow powerful software verification tools to be used to perform a wide range of static tests on the code to detect a range of programming errors.
[...] An important issue in the selection of a programming language is the quality of the available compilers and other tools. For certain languages validated compilers are available. While not guaranteeing perfection, validation greatly increases our confidence in a tool. Unfortunately, validated compilers are only available for a limited number of languages, such as Ada and Pascal. In addition to compilers, developers of critical systems will make use of a range of other tools such as static code analysis packages. The static tests that can be performed on a piece of code vary greatly depending on the language used. To aid this process it is common to restrict the features that are used within certain languages to a ‘safe subset’ of the language. Well structured and defined languages such as subsets of Ada, Pascal and Modula-2 allow a great many tests to be performed, such as data flow analysis, data use analysis, information flow analysis and range checking. Unfortunately many of these tests cannot be performed on languages such as C and C++.
It would be really beyond the scope of this question to go into even more detail, but you may want to check out some of the following pointers:
Ada compared to C and C++
Ada vs. C
Quantifying the Debate: Ada vs. C++
Why choosing Ada as a teaching language? (Ada vs. C in University)
Comparing Development Costs of C and Ada (summary)
C / C++ / Java Pitfalls & Ada Benefits
Is Ada a better C?
Ada, C, C++, and Java vs. The Steelman
Ada: Dispelling the Myths
Real-time programming safety in Java and Ada
If anyone wants to look into Ada some more, check out this: Ada Programming (wikibooks)
There are even programming languages that were specifically developed for highly critical applications, such as JOVIAL or HAL/S, the latter of which was used by the Space Shuttle program.
"Are there any drivers written in other languages?"
I have seen some Linux drivers for special hardware written in Ada, though I don't know about other operating systems. However, such drivers usually end up wrapping the C API.
Because C has the best combination of speed, low memory use, low-level access to the hardware, and popularity.
Most operating systems have a kernel written in C, and applications on top of that written in either C, C++, C# or Obj-C
C is by far the easiest language (other than assembly) to "get going" with on bare-bones hardware. With C (assuming you have a 32-bit bootloader such as GRUB to do the hard mode-switching), all you must do is create a little crt0.asm file that sets up the stack, and that's it (you get the language, not including libc). With C++ you must worry about dynamic casts, exceptions, global constructors, overriding new, and so on. With C# you must port the .NET runtime (which on its own basically requires a kernel), and I'm not sure about Obj-C, but I'm sure it has some requirements too...
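To make the crt0 idea concrete, here is a hedged sketch (the Multiboot header and linker script are omitted, so this is the shape of the thing, not a buildable kernel): a few lines of startup assembly that reserve a stack and jump into C, plus a C entry point poking the legacy VGA text buffer.

    /* startup stub: carve out a stack, point %esp at it, call kmain */
    __asm__(
        ".section .bss\n"
        ".align 16\n"
        "stack_bottom: .skip 16384\n"
        "stack_top:\n"
        ".section .text\n"
        ".global _start\n"
        "_start:\n"
        "    mov $stack_top, %esp\n"
        "    call kmain\n"
        "1:  hlt\n"
        "    jmp 1b\n");

    /* GRUB has left us in 32-bit protected mode; the VGA text buffer
       sits at 0xB8000, one character byte plus one attribute byte per
       cell */
    void kmain(void)
    {
        volatile unsigned short *vga = (unsigned short *)0xB8000;
        const char *msg = "C running with no OS underneath";
        for (int i = 0; msg[i]; i++)
            vga[i] = (unsigned short)(0x07 << 8 | msg[i]);
        for (;;)
            __asm__("hlt");
    }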
C is simply the easiest language to use for drivers. Not only is it easy to get started with, but it's also easy to know exactly what happens at the machine level: there is no operator overloading and the like to confuse you. Sure, that's handy in a "good" environment, but in Ring 0 a bad pointer doesn't just crash your application; it usually results in a triple fault (reboot), blue screen, or kernel panic. You really want to know what goes on in your machine.
"why we are using C language for writing drivers and OS codes.?"
So that programmers won't have to learn new syntax of each new assembly language for each new kind of machine.
"Is there any drivers written in any other languages?"
Historically, assembly languages. I don't remember whether PL/S or BLISS could be used for drivers. Maybe B. In modern times, a few brave people use C++, but they have to be very careful. C++ can be used a bit more easily in user-mode drivers in some situations.
Lisp machines had their operating systems written in Lisp, which shows that you don't have to use C or assembly. The Lisp machine business was destroyed by the availability of cheap PCs, whose operating systems were of course written in C and assembly.
C was one of the very first languages (that wasn't assembly) that was suitable for writing operating systems, so it caught on early. While other languages have appeared since that are also suitable for writing operating systems in, C has remained popular perhaps due to its long history and the familiarity programmers have with its structure and syntax.
C is also a language that teaches a lot about memory management and is low-level enough to show the barrier between hardware and software. This is something that is rare among many methodologies today, that have grown more towards abstraction way above anything at the hardware level. I find C is a great way to learn these things, while being able to write speedy code at the same time.
Remember that C was originally developed for writing operating systems (in this case, Unix) and similar low-level stuff. It is very close to the system architecture and contains no extra machinery whose exact workings are out of the programmer's control. However, note that the rest of an operating system, including the programming libraries, does not have to be written in the same language as the kernel. Kernel functions are provided through a system of interrupts, and such programming libraries can in fact be written in any language that supports assembler snippets.
The most popular operating systems nowadays are written in C: Windows, Linux, and many other Unix clones. However, this is not a rule. There are some object-oriented operating systems, where both the kernel and the programming interface are written in an object-oriented language, such as:
NeXTSTEP - Objective-C
BeOS - C++
Syllable - C++
See: Object-oriented operating system on Wikipedia
Note that in Linux it is possible to write kernel drivers in languages other than C (however, it is not recommended). Anyway, everything becomes machine code when it comes to running it.
"C compiles down to machine code and doesn't require any runtime support for the language itself."
This is the best feature of C
It's my belief that languages like Python, Java, and others are popular mainly because they offer extensive standard libraries that let the programmer code a solution in fewer lines. A Ruby programmer can literally open and read a file in one line where in C it takes multiple lines; but beneath that abstraction are multiple lines, so the same abstraction can be built in C, and doing so is recommended. Oddly, reducing the total line count doesn't seem to be part of the C philosophy, so there is no organized effort to do so. C is viewed as a language for every processor chip, and that naturally makes it hard to create 'standard' abstracted one-line solutions. But C does allow #ifdef preprocessor commands, so in theory you can keep multiple variations of an implementation for multiple processors and platforms in one header file, which isn't the case for Python or Java. So while C does not have a fancy standard library, it is excellent for portability. If your company wants to ship programs that run on desktop computers, embedded systems, and portable devices, then C is your language of choice. It's hard to replace C's usefulness in the world.
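For a sense of what those "multiple lines" look like, here is a hedged sketch of the C counterpart to a one-line "slurp a file" in Ruby or Python, wrapped once so the abstraction can be reused:

    #include <stdio.h>
    #include <stdlib.h>

    /* read a whole file into a NUL-terminated buffer; caller frees */
    char *read_file(const char *path, long *len_out)
    {
        FILE *f = fopen(path, "rb");
        if (!f)
            return NULL;
        fseek(f, 0, SEEK_END);
        long len = ftell(f);
        if (len < 0) {
            fclose(f);
            return NULL;
        }
        rewind(f);
        char *buf = malloc(len + 1);
        if (buf && fread(buf, 1, len, f) == (size_t)len) {
            buf[len] = '\0';
            if (len_out)
                *len_out = len;
        } else {
            free(buf);
            buf = NULL;
        }
        fclose(f);
        return buf;
    }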
As another note on machines that have drivers in other languages, there is the Sun SPOT robotics platform. Drivers for connected devices (sensors, motors, and everything else that can communicate via the I/O pins) are written in Java by the user.

Advice for a C, CUDA, & ANN Newbie? [closed]

I'm a business major, two-thirds of the way through my degree program, with a little PHP experience, having taken one introductory C++ class, and now regretting my choice of business over programming/computer science.
I am interested in learning more advanced programming; specifically C, and eventually progressing to using the CUDA architecture for artificial neural network data analysis (not for AI, vision, or speech processing, but for finding correlations between data-points in large data sets and general data/statistical analysis).
Any advice about how I should start learning C? As well as ANN/Bayesian technology for analyzing data? There are so many books out there, I don't know what to choose.
Since CUDA is fairly new, there doesn't seem to be much learner-friendly (i.e. dumbed-down) material for it. Are there learning resources for CUDA beyond the NVIDIA documentation?
Further, what resources would you recommend to me that talk about GPGPU computing and massively parallel programming that would help me along?
I don't recommend trying to learn CUDA first since it's a new technology and you don't have much background in programming.
Since you don't have much experience in C (or C++), CUDA will be a pain to learn since it lacks maturity, libs, nice error messages, etc.
CUDA is meant for people who are familiar with C (C++ experience helps too) and have a problem which needs performance improvement by recoding or rethinking the solution of a well known problem.
If you're trying to solve "ANN/Bayesian" problems, I would recommend creating your solution in C++ or C, your choice. Don't bother with creating threads or multithreading at first. Then, after evaluating the response times of your serial solution, try to make it parallel using OpenMP, Boost threads, or whatever. After this, if you still need more performance, then I would recommend learning CUDA.
I think these are valid points because CUDA has some pretty cryptic errors, hard to debug, totally different architecture, etc.
If you're still interested, these are some links to learn CUDA:
Online courses:
GPGP
CIS 665
Richard Edgar's GPU Computing Pages
Forum (the best source of information):
NVIDIA CUDA Forum
Tools:
CUDPP
Problems solved in CUDA:
gpuDG
Histogram Computation
You've expressed 3 different goals:
Learning to program in C
Learning to write code for the CUDA platform
Learning to use Bayes' Nets and/or Neural nets for data analysis
Firstly: these things are not easy for people who already have several degrees in the field. If you only do one, make sure to learn about Bayesian inference. It's by far the most powerful framework available for reasoning about data, and you need to know it. Check out MacKay's book (mentioned at the bottom). You certainly have set yourself a challenging task - I wish you all the best!
Your goals are all fairly different kettles of fish. Learning to program in C is not too difficult. I would, if at all possible, take the "Intro to Algorithms & Data Structures" course (usually the first course for CS majors) at your university (it's probably taught in Java). This will be extremely useful for you, and basic coding in C will then simply be a matter of learning the syntax.
Learning to write code for the CUDA platform is substantially more challenging. As recommended above, please check out OpenMPI first. In general, you will be well served by reading something about computer architecture (Patterson & Hennessy is nice), as well as a book on parallel algorithms. If you've never seen concurrency (i.e., if you haven't heard of a semaphore), it would be useful to look it up (lecture notes from an operating systems course will probably cover it; see MIT OpenCourseWare). Finally, as mentioned, there are few good references available for GPU programming since it's a new field, so your best bet is to read example source code to learn how it's done.
Finally, Bayesian nets and Neural nets. First, please be aware that these are quite different. Bayesian networks are a graphical (nodes & edges) way of representing a joint probability distribution over a (usually large) number of variables. The term "neural network" is somewhat vaguer, but generally refers to using simple processing elements to learn a nonlinear function for classifying data points. A book that gives a really nice introduction to both Bayes' nets and Neural nets is David J.C. MacKay's Information Theory, Inference and Learning algorithms. The book is available for free online at http://www.inference.phy.cam.ac.uk/mackay/itila/. This book is by far my favorite on the topic. The exposition is extremely clear, and the exercises are illuminating (most have solutions).
If you're looking for a friendly introduction to parallel programming, instead consider Open MPI or Posix Threading on a CPU cluster. All you need to get started on this is a single multi-core processor.
The general consensus is that multi-programming on these new architectures (GPU, Cell, etc.) has a way to go in terms of the maturity of its programming models and APIs. Conversely, Open MPI and Pthreads have been around for quite a while, and there are lots of resources around for learning them. Once you have gotten comfortable with these, then consider trying out the newer technologies.
While there are certainly programming interfaces for many other languages, C is probably the most common modern language (Fortran and Pascal are still kicking around in this area) in use in high-performance computing. C++ is also fairly popular, though; several bioinformatics packages use it. In any case, C is certainly a good starting place, and you can bump up to C++ if you want more language features or libraries (probably at some cost in performance, though).
If you are interested in data mining, you might also want to look at the open source system called Orange. It is implemented in C++ but it also supports end-user programming in Python or in a visual link-and-node language.
I don't know if it supports NNs but I do know people use it for learning datamining techniques. It supports stuff like clustering and association rules.
(Also, in case you didn't know about it, you might want to track down somebody in your B-school who does operations management. If you're interested in CS and datamining, you might find likeminded people there.)
Link: gpgpu.org has some interesting discussion.
The latest CUDA releases (3.1, 3.2) have a full-featured set of functions called cuBLAS that will handle parallel matrix operations for you on single-card setups. Parallelizing the backpropagation is a bit more of a challenge, but I'm working on it.
I was able to find some great video courses, free from Stanford, on iTunes U:
Programming Methodology (CS106A)
Programming Abstractions (CS106B)
Programming Paradigms (CS107)
Machine Learning (CS229)
Programming Massively Parallel Processors with CUDA
Each one of these courses has around 20 or so lectures, so it's an investment to watch them all, but well worth it.
