What is wrong with using Turbo C?

I always find that some people (a majority from India) are using Turbo C.
I cannot find any reason to use such an outdated compiler...
But I don't know what reasons to give when trying to persuade them to use a modern compiler (GCC, MSVC, ...).

Turbo C is a DOS-only product. This means that it no longer runs natively on 64-bit versions of Windows, and must be run inside the XP compatibility penalty box.

While there are plenty of reasons not to use Turbo C (it's old, predates standards, generates 16-bit code, etc.), it's not valid to answer a question like "How do I do X in Turbo C?" with "Just use GCC". That would be like somebody asking "How do I do X with my 1992 Toyota?" and you saying "Just get a newer car".
People who are using Turbo C are probably doing so because it's a requirement, not because they don't know about anything better. Odds are it's for a programming class where the assignments they turn in have to work in that compiler. When I was grading C++ assignments, it didn't matter what compiler the students used, but their submissions had to compile and run properly with the compiler I was using.

I would say support and standards compliance would be the two big issues for me.
Good luck even finding Borland/Inprise/Borland/Codegear/Embarcadero, or whatever they call themselves nowadays. Even more kudos if you can get them to admit these products exist (although I did at some point get them from the Borland museum on BDN).
Performance can be important but the vast majority of applications I write spend 90% of their time waiting for the user (I don't do genome sequencing, SETI analysis or protein folding - the market is pretty small).
Honestly, if I have the choice between two free products (where obviously money is not an issue), I'll always select the best (that would be GCC for me).

Turbo C generates 16-bit x86 code. Kiiinda nice when you're developing on a 16-bit x86 processor.
Been there. Done that.
The pragmatic reasons for changing are that GCC is under active development, with bug fixes, and that it runs natively on modern operating systems and modern chips.

It was also my first compiler (4 yrs ago), though I switched to GCC soon enough when I learned that it didn't follow the latest standards and relied on features that are considered deprecated or bad practice. These were reasons enough for me to make the switch.

The most important reason to use a decent C compiler is performance. Since GCC optimizes code aggressively, the compiled programs can easily be tens of percent faster than what Turbo C produces.

Turbo C is much simpler to configure and use, runs on old DOS machines, and is compact in size. I guess that is the reason.
However, it takes very little advantage of modern processors.

Related

Why are compilers written in C/C++ instead of CoffeeScript (JavaScript, Node.js)?

I am exposed to C because of embedded systems programming, and I think it's a wonderful language in this field. However, why is it used to write compilers? If the reason gcc is implemented in C/C++ is that there weren't many good languages at the time, there's no excuse for clang taking the same path (using C/C++).
Is it for performance reasons? Interpreted languages are mostly a bit slower than compiled languages, but I guess the difference is almost negligible for CoffeeScript (JavaScript) because of Node.js.
From the perspective of developers, I suppose it's much easier to write a compiler in a high-level language. Unfortunately, most of the compilers out there are written in C/C++. Is it just because of legacy code?
Response to comments:
Bootstrapping is just one way to illustrate that a language is powerful enough to write a compiler in. It shouldn't be the dominant reason for choosing the implementation language of a compiler.
I agree with the guess given below, that "most compiler developers would answer because most compiler-related tools (bison, yacc) emit C code". However, neither GCC nor Clang uses a generated parser; they implement their own. This front-end work is independent of the target architecture, and should not be a particular strength of C/C++.
There's more or less consensus that performance is one key factor. Indeed, even for GCC and Clang, building a reasonably sized C project (the Linux kernel) takes a lot of time. Is that because of the front end or the back end? I have to admit that I don't have much experience with compiler back ends, as our compilers course finished with generating LLVM code.
I am exposed to C because of embedded systems programming, and I think
it's a wonderful language in this field.
Yes. It's better than Java.
However, why is it used to write compilers?
This question can't be answered without asking the developers. I suspect that the majority of them will tell you that common compiler-writing software (yacc, flex, bison, etc.) produces C code.
If the reason for gcc is that there aren't many good languages,
there's no excuse for clang.
GCC isn't a programming language, and neither is Clang. They're both implementations of the C programming language.
Is it for performance reasons?
Don't confuse implementation with specification. Speed is an attribute introduced by your compiler and your computer, not by the programming language. GCC happens to produce fairly efficient machine code, which might influence developers to use C as their primary programming language... but in ten years' time, it could* be that node.js produces more efficient machine code than GCC. Don't forget, StackOverflow is forever.
* could, but most likely won't. See Ira Baxter's comment below for more info.
Interpreted languages are mostly a bit slower than compiled
languages, but I guess the difference is almost negligible for
CoffeeScript (JavaScript) because of Node.js.
Similarly, interpretation or compilation isn't the choice of the language, but of the implementation of the language. For example, GCC and Clang choose to compile C to machine code. Ch and CINT are two interpreters that translate C code directly to behaviour, rather than machine code. Java was once predominantly translated using interpretation, too, but is now predominantly compiled into JVM bytecode. Javascript seems to be moving towards predominant compilation, too. Who knows? Maybe you'll see compilers written predominantly in Javascript in ten years' time...
From the perspective of developers, I suppose it's much easier to
write a compiler in a high-level language.
All of these programming languages are technically high level. They're mostly defined in terms of an abstract machine; they're certainly not low level.
Unfortunately, most of the compilers out there are written in C/C++.
I don't consider it unfortunate that C++ is used to write software; it's not a bad programming language.
Is it just because of legacy code?
I suppose legacy code might influence the decision of a programmer. In the end though, as I said, you'd have to ask the developers. They might just decide to use C or C++ because C or C++ is their favourite programming language... Why do you speak English?
Compilers are very complex software in general. The front-end part is pretty simple (parsing), but the back-end parts (scheduling, code generation, optimization, register allocation) involve NP-complete problems (of course, compilers approximate solutions to these problems). Thus, implementing in C helps compile times. C is also very good at bitwise operations and other low-level work, which is useful when writing a compiler.
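As a small, hedged illustration of that bit-level work (the data layout here is invented for the example, not taken from any real compiler), liveness analysis for register allocation is typically expressed with bitsets, which C handles very directly:

/* Hedged sketch: one 64-bit word as a register bitset, plus the classic
   liveness dataflow equation in[B] = use[B] | (out[B] & ~def[B]). */
#include <stdint.h>
#include <stdio.h>

typedef uint64_t regset;   /* one bit per (hypothetical) machine register */

static regset live_in(regset live_out, regset def, regset use)
{
    return use | (live_out & ~def);
}

int main(void)
{
    regset out = 0x0F;  /* r0..r3 live after the block */
    regset def = 0x02;  /* the block defines r1        */
    regset use = 0x10;  /* the block uses r4           */
    printf("live-in set: %#llx\n", (unsigned long long)live_in(out, def, use));
    return 0;
}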
Note that not all compilers are written in C, though. For example, the Haskell compiler GHC is written in Haskell, using bootstrapping.
Javascript is async, which doesn't suit compiler writing.
I see many reasons:
There is no elegant way of handling bit-precise code in Javascript
You can't write binary files easily in Javascript, so the assembler part of the compiler would have to be in a more low-level language (see the sketch after this list)
Huge JS codebases are very heavy to load in memory (it's plain text, remember?)
Writing optimization routines for compilers is heavily CPU-intensive, which is not yet very compatible with Javascript
You wouldn't be able to compile your compiler with it (bootstrap), because you need a Javascript interpreter behind your compiler. The bootstrap phase wouldn't be "pure":
JS Compiler compiles NodeJS -> NodeJS runs your new Compiler -> new JS Compiler
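To make the binary-output point in the list above concrete, here is a hedged C sketch of the raw byte emission an assembler back end does constantly; the byte values are illustrative placeholders, not a blessed encoding:

/* Hedged sketch: write a handful of raw machine-code bytes to a file.
   The bytes are placeholders for whatever the code generator produced. */
#include <stdio.h>

int main(void)
{
    unsigned char code[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3 };
    FILE *out = fopen("out.bin", "wb");
    if (out == NULL)
        return 1;
    fwrite(code, 1, sizeof code, out);
    fclose(out);
    return 0;
}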
gcc is implemented primarily in C, but that is not true of all compilers, including some that are quite standard. It is a common pattern for a compiler to be implemented in the language that it compiles. ghc is written largely in Haskell. Recent versions of guile feature a compiler implemented mostly in Scheme.
Nope, CoffeeScript et al. are still much slower than natively compiled (and optimised) C code. Even if you take the subset of Javascript that can be optimised (asm.js), it's still twice as slow as native C.
What you hear when people say node.js is just as fast as C code is that it's just as fast as part of an overall system that does other things, like read from disk or wait for data off the network. In these systems the CPU is underused (especially with today's superfast CPUs), so the performance problem is not the raw processing capability of the language. Hence, a node.js server is exactly as fast as a C server if they're both stuck waiting for a network call to return data. The type of system written in node.js does a lot of waiting for the network, which is why people use node.js. The type of system written in C does not suit being written in node.js.

API & Compiler for C in Windows 7, primary concern is

I've been using a language for Garry's Mod (a Source-engine-based physics sandbox game) called Expression 2 for a while now. Expression 2 is very heavily based on C; in fact, it's almost exactly the same, except that it uses commas or line returns to end lines, rather than semicolons. At the suggestion of a friend I gave a C tutorial a look, and I found the syntax and language instantly familiar, albeit with a number of scary-looking semicolons.
In the past I've dabbled with writing emulators in Expression 2 and have had some success with a CHIP-8 emulator and a few headaches working on a Game Boy emulator. You see, one problem with Expression 2 is the lack of proper bitwise functions and switch-case statements (something I'm really looking forward to using in C). Likewise, as the language is interpreted on top of Lua, it's naturally very slow; great for making stuff in Garry's Mod, but not much else.
The advantages of course of Expression 2 are easy interaction with the Game itself, it was built just to do that, working with C however would be the opposite, where interacting with the OS can be the most frustrating part.
First of all, I'm wondering what the best available C compilers for Windows 7 64-bit are. I've given Visual C++ a whirl but find its myriad of options and configurations daunting, and as someone who's just starting out I'd really like a compiler that just takes my .c or .txt or whatever and outputs an application. I don't need all the bells and whistles when I'm just starting out.
Secondly, as I'd like to write emulators, I need a good method of drawing on the screen; as per my understanding, that requires an API to interact with the OS? What's the best API for Windows 7 64-bit in terms of ease of use and 2D drawing? And where can I find some good documentation or tutorials on said API?
Thank you for reading, I'm really looking forward to getting stuck in with some real programming but unfortunately tutorials usually skip over things like choosing a compiler :/
MinGW will give you a minimalistic (yet complete) environment for building C code. Either of SDL or DirectX is fine for 2D graphics (and more).
Most will agree that the Intel C/C++ compiler still stands unbeaten, but you will most likely be OK with GCC/MinGW or MSVC (although I recommend avoiding MSVC if possible).
As for the APIs: DirectX is Windows-native and has the best support, but you will be locked to the Windows platform; if that's what you are targeting, though, DirectX is your best shot, I guess.
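If you go the SDL route mentioned above, a minimal window-and-clear sketch looks roughly like this (a hedged sketch, assuming the SDL2 development package is installed; exact linker flags such as -lSDL2 vary with your setup):

/* Minimal SDL2 sketch: open a window, clear it to a solid colour,
   wait a few seconds, and exit. Illustration only, not a full template. */
#include <SDL2/SDL.h>

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0)
        return 1;

    SDL_Window *win = SDL_CreateWindow("demo", SDL_WINDOWPOS_CENTERED,
                                       SDL_WINDOWPOS_CENTERED, 640, 480, 0);
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_ACCELERATED);

    SDL_SetRenderDrawColor(ren, 0, 0, 128, 255);  /* dark blue */
    SDL_RenderClear(ren);
    SDL_RenderPresent(ren);
    SDL_Delay(3000);                              /* keep the window up for 3 s */

    SDL_DestroyRenderer(ren);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}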
Well, a lot of questions and C makes some of them hard...
As far as compilers go, you have two common options: MSVC and GCC. MSVC is the commercial (but also freely usable) compiler/toolchain, which is a bit behind on C compatibility. GCC has pretty good C99 support if you need it, and is available for Windows as well: http://mingw-w64.sourceforge.net (*)
As for drawing on the screen, there is of course OpenGL, but that's maybe not the handiest for simple(?) 2D. Alternatives are Direct2D, GDI(+), or a platform abstraction library like GLib which also provides simple drawing facilities.
(*) Although C compilers often come with a large set of options (you'll start to see why when you get better at coding in the language), compiling your source files to a runnable program will come down to this (GCC):
gcc -O2 *.c -lsomeLibraries -o program.exe
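For instance, a complete (if minimal) program you could feed to that command would be:

/* hello.c - compile with the gcc line above (you can drop -lsomeLibraries,
   since this program needs no extra libraries). */
#include <stdio.h>

int main(void)
{
    printf("Hello, world\n");
    return 0;
}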
For the rest, read a good book; that will also get you started!

C compiler producing lightweight executables

I'm currently using MSVC for C++, but as I'm switching to C to write a very performance-intensive program (an interpreter), I have to search for a fitting C compiler.
I've looked at some binaries produced by Turbo C, and even though it's old they seem pretty straightforward and optimized.
Now I don't know what the best compiler for building an interpreter is, but maybe you can help me.
I've considered GCC but as I don't know much about it, I can't be really sure.
99.9% of a program's performance depends on the code you write and the language you choose.
You can safely ignore the performance of the compiler.
Stick to MSVC... and don't waste time :)
If I were you, I would take the approach of worrying less about the compiler and worrying more about your own code. Write the code for the interpreter in a reasonable way. Then, profile it, and optimize spots based on how much time they take. That is more likely to produce a performance benefit than using a particular compiler.
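If you want a quick, portable first look at where the time goes before reaching for a real profiler, standard C's clock() is enough for coarse measurements (a minimal sketch; interpret_chunk() is a hypothetical stand-in for the code under test):

/* Hedged sketch: coarse timing using only the standard library. */
#include <stdio.h>
#include <time.h>

static void interpret_chunk(void)
{
    volatile long sum = 0;               /* placeholder workload */
    for (long i = 0; i < 10000000L; ++i)
        sum += i;
}

int main(void)
{
    clock_t start = clock();
    interpret_chunk();
    clock_t end = clock();
    printf("took %.3f s of CPU time\n", (double)(end - start) / CLOCKS_PER_SEC);
    return 0;
}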
If you want a lightweight program, it is not the compiler you need to worry about so much as the code you write and the libraries you use. Most compilers will produce similar results from the same source code.
For example, using C++ with MFC, a basic Windows application used to start off at about 900 kB and grow rapidly. Linking with the dynamic MFC DLLs would get you down to a few hundred kB. But by dropping MFC totally - using Win32 APIs directly - and using a minimal C runtime, it was relatively easy to implement the same thing in an .exe of about 25 kB or less (IIRC - it's been a long time since I did this).
So ditch the libraries and get back to proper low level C (or even C++ if you don't use too many "clever" features), and you can easily write very compact applications.
edit
I've just realised I was confused by the question title into talking about lightweight applications as opposed to concentrating on performance, which appears to be the real thrust of the question. If you want performance, then there is no specific need to use C, or to move to a painful development environment - just write good, high-performance code. Fundamentally this is about using the correct designs and algorithms, and then profiling and optimising the resulting code to eliminate bottlenecks and inefficiencies. Note that these days you may achieve a much bigger bang for your buck by switching to a multithreaded approach than by concentrating on raw code optimisation - make sure you utilise the hardware well.
You can use GCC, through MingW, Eclipse CDT, or one of the other Windows ports. You can optimize for executable size, speed of resulting executable, or speed of compilation.
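For example (the file names here are placeholders), those trade-offs map onto GCC options roughly as follows:

gcc -Os interp.c -o interp.exe    (favour a small executable)
gcc -O2 interp.c -o interp.exe    (favour speed of the resulting executable)
gcc -O0 interp.c -o interp.exe    (favour speed of compilation)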
C++ was designed to be backward compatible with C, so any C++ compiler should be able to compile pure C. You might want to tell it that it's C and not C++, so the compiler doesn't do name mangling, etc. If the compiler is very good with C++, it should be equally good, or better, with C, because C is much simpler.
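In practice, "telling it" usually means letting the .c file extension pick the C front end, or passing a flag such as MSVC's /TC or GCC's -x c. For headers shared between C and C++ code, the standard guard below is the usual way to keep a C++ compiler from name-mangling C declarations (a routine sketch; the header and function names are hypothetical):

/* mylib.h - hypothetical header usable from both C and C++ */
#ifdef __cplusplus
extern "C" {                 /* suppress C++ name mangling for these declarations */
#endif

int my_c_function(int x);

#ifdef __cplusplus
}
#endif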
I would suggest sticking with MSVC. It's a very decent system. Though if you are not convinced - compare your options. Build the same program with multiple compilers, look at the assembly they produce, measure the resulting executable performance, etc.

Why is C the language of compilers, when a Scheme subset would seem to be a better fit?

I was just listening to episode 57 of Software Engineering Radio
(TRANSCRIPT: http://www.se-radio.net/transcript-57-compiletime-metaprogramming )
I'm only 40 minutes in, but I'm wondering why C is the language of compilers, when a Scheme subset (or some other HLL) would seem to be a better fit.
(excluding the obvious reason of not wanting to rewrite gcc)
PS originally posted this at LtU http://lambda-the-ultimate.org/node/3754
I won't bother to listen to 40 minutes of radio to perhaps understand your question more thoroughly, but I would claim the exact opposite. Only a minority of compilers are written in C. I rather have the impression that (at least where appropriate), most compilers are implemented in the language they are meant to compile.
C need not be the language for compilers, but it does have some advantages. C is available on almost all platforms and that makes it easy to port and bootstrap the compiler. C is closer to the hardware and makes possible many optimizations that will be difficult to achieve in other languages. It is easy for a compiler written in C to co-exist with other languages, libraries and systems as most of them provide a C interface. It is also easy for others to extend the compiler as C is the Esperanto of system programmers.
Well, one reason will be the issue of bootstrapping the compiler on unsupported architectures. That will usually require the existence of a working compiler for that architecture, which generally means C. I remember trying to compile MIT-scheme from source, and getting really pissed off that it required MIT-scheme to be installed before I could build MIT-scheme.
Incidentally, I'm not sure I agree with your premise... C certainly seems to be the most widely deployed language, but other language compilers (e.g. MIT-scheme) are often implemented in those languages.
It's probably a combination of factors:
C compilers are available for almost every platform, making it easier to build a new compiler for a new language.
History: C is a very popular language, so it makes sense that a lot of projects are in C (no matter the project).
Scheme, specifically, is very unpopular (compared to C).
C has Flex and Yacc, which help with implementing the front end (lexer and parser) of a compiler; if I remember right, their output is limited to C code.
Many compilers today are written in languages other than C (such as Scheme). To make them portable they initially generate C code as a target language.
I think a lot has to do with backends. Someone mentioned Flex and Yacc, but there's also GCC and LLVM that will help you with a lot of other important stuff, like optimizations.

We have to use C "for performance reasons" [closed]

In this age of many languages, there seems to be a great language for just about every task and I find myself professionally struggling against a mantra of "nothing but C is fast", where fast is really intended to mean "fast enough". I work with very rational open-minded people, who like to compare numbers, and all I have are thoughts and opinions. Could you help me find my way past subjective opinions and into the "real world"?
Would you help me find research as to what if any other languages could be used for embedded and (Linux) systems programming? I very well could be pushing a false hypothesis and would greatly appreciate research to show me this. Could you please link or include good numbers so as to help keep the "that's just his/her opinion" comments to a minimum.
So these are my particular requirements:
memory is not a serious constraint
portability is not a serious concern
this is not a real time system
In my experience, using C for embedded and systems programming isn't necessarily a performance issue - it's often a portability issue. C tends to be the most portable, well supported language on just about every platform, especially on embedded systems platforms.
If you wish to use something else in an embedded system, it's often a matter of figuring out what options are available, then determining whether the performance, memory consumption, library support, etc, are "good enough" for your situation.
"Nothing but C is fast [enough]" is an early optimisation and wrong for all the reasons that early optimisations are wrong. If your system has enough complexity that something other than C is desirable, then there will be parts of the system that must be "fast enough" and parts with lighter constraints. If writing your code, for example, in Python will get the project finished faster, with fewer bugs, then you can follow up with some C or assembly code to speed up the time-critical parts.
Even if it turns out that the entire code must be written in C or assembly to meet the performance requirements, prototyping in a language like Python can have real benefits. You can take your working Python prototype and gradually replace parts with C code until you reach the necessary performance.
So, use the tools that let you get the development work done most correctly and most quickly, then use real data to determine where you need to optimize. It could be that C is the most appropriate tool to start with sometimes, but certainly not always, even in embedded systems.
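As a hedged sketch of that workflow, the hot loop below is the kind of thing you might move out of the Python prototype into C, build as a shared library, and call through ctypes; the function and file names are hypothetical:

/* hotpath.c - hypothetical hot loop moved from the prototype into C.
   One possible build (GCC): gcc -O2 -shared -fPIC hotpath.c -o libhotpath.so */
#include <stddef.h>

double sum_squares(const double *data, size_t n)
{
    double acc = 0.0;
    for (size_t i = 0; i < n; ++i)
        acc += data[i] * data[i];
    return acc;
}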
There are some very good reasons for using C in embedded systems, of which "performance" is only a minor one. Embedded work is very close to the hardware; you need manual memory addressing to communicate with the hardware, and the APIs and SDKs are mostly available for C.
There are only a few platforms that can run a VM for Java or Mono, which is partially due to the performance implications but also to expensive implementation costs.
Apart from performance, there is another consideration: you'll most likely be dealing with low-level APIs that were designed to be used in C or C++.
If you cannot use some SDK, you'll only get yourself in trouble instead of saving time with developing using a higher level language. At the very least, you'll end up redoing a bunch of function declarations and constant definitions.
For C:
C is often the only language that is supported by compilers for a processor.
Most of the libraries and example code are probably also in C.
Most embedded developers have years of C experience but very little experience in anything else.
Allows direct hardware interfacing and manual memory management (see the sketch below).
Easy integration with assembly language.
C is going to be around for many years to come. In embedded development it's a monopoly that smothers any attempt at change. A language that needs a VM, like Java or Lua, is never going to go mainstream in the embedded environment. A compiled language might stand a chance if it provides compelling new features over C.
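To illustrate the direct-hardware-interfacing point from the list above (a hedged sketch: the register address and bit position are invented for the example; real values come from the chip's datasheet):

/* Hedged sketch of memory-mapped I/O in C. The address 0x40021000 and
   bit 5 are made up for illustration only. */
#include <stdint.h>

#define GPIO_OUT (*(volatile uint32_t *)0x40021000u)   /* hypothetical output register */

void led_on(void)
{
    GPIO_OUT |= (1u << 5);    /* set bit 5: drive the (hypothetical) LED pin high */
}

void led_off(void)
{
    GPIO_OUT &= ~(1u << 5);   /* clear bit 5: drive it low */
}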
There are several benchmarks on the web comparing different languages. In most of them you will find a C or C++ implementation at the top, as they give you more control to really optimize things.
Example: The Computer Language Benchmarks Game.
It's hard to argue against C (or other procedure languages like Pascal, Modula-2, Ada) and assembly for embedded. There is a large history of success with those languages. Generally, you want to remove the risk of the unknown. Trying to use anything other than C or assembly, in my opinion, is an unknown. Having said that, there's nothing wrong with a mixed model where you use one of the Schemes that go to C, or Python or Lua or JavaScript as a scripting language.
What you want is the ability to quickly and easily go to C when you have to.
If you convince the team to go with something that is unproven to them, the project is your cookie. If it crumbles, it'll likely be seen as your fault.
This article (by Michael Barr) talks about the use of C, C++, assembler and other languages in embedded systems, and includes a graph showing the relative usage of each.
And here's another article, fittingly entitled, Poor reasons for rejecting C++.
Ada is a high-level programming language that was designed for embedded systems and mission-critical systems.
It is a fast, secure language with data checking built in everywhere. It is what the autopilots in airplanes are programmed in.
At this link you have a comparison between Ada and C.
There are situations where you need real-time performance, especially in embedded systems. You also have severe memory constraints. A language like C gives you greater control over execution time and execution space.
So, depending on what you are doing, C may very well be "better" or more appropriate.
Check out the following articles
http://theunixgeek.blogspot.com/2008/09/c-vs-python-speed.html
http://wiki.python.org/moin/PythonSpeed/PerformanceTips (especially see Python is not C section)
http://scienceblogs.com/goodmath/2006/11/the_c_is_efficient_language_fa.php
C is ubiquitous, available for almost any architecture, usually from day-one of a processor's availability. C++ is a close second. If your system can support C++ and you have the necessary expertise, use it in preference to C - it is all that C is, and more, so there are few reasons for not using it.
C++ is a larger language, and there are constructs and techniques it supports that may consume resources or behave in unacceptable ways in an embedded system, but that is not a reason to avoid the language; it is rather a matter of learning how to use it appropriately.
Java and C# (on Micro.Net or WinCE) may be viable alternatives for non-real-time.
You may want to look at the D programming language. It could use some performance tuning, as there are some areas where Python can outperform it. I can't really point you to benchmarking comparisons since I haven't been keeping a list, but as pointed out by Peter Olsson, Benchmarks & Language Implementations has D (Digital Mars).
You will probably also want to look at these lovely questions:
Getting Embedded with D (the programming language)
How would you approach using D in a embedded real-time environment?
I'm not really a systems/embedded programmer, but it seems to me that embedded programs generally need deterministic performance - that immediately rules out many garbage collected languages, because they are not deterministic in general. However, there has been work on deterministic garbage collection (for example, Metronome for Java: http://www.ibm.com/developerworks/java/library/j-rtj4/index.html)
The issue is one of constraints - do the languages/runtimes meet the deterministic-behaviour, memory-usage, etc. requirements?
C really is your best choice.
There is a difference between writing portable C code and getting too deep into the gee-whiz features of a specific compiler or the corner cases of the language (all of which should be avoided). Portable C buys you portability across compilers and compiler versions, a larger pool of employees capable of developing or maintaining the code, and compilers that have an easier time with it and produce better, cleaner, and more reliable code.
C is not going anywhere, with all the new languages being designed to fix the flaws in all the prior languages. C, with all the flaws these new languages are trying to fix, still stands strong.
Here are a couple of articles that compare C# to C++:
http://systematicgaming.wordpress.com/2009/01/03/performance-c-vs-c/
http://journal.stuffwithstuff.com/2009/01/03/debunking-c-vs-c-performance/
Not exactly what you asked for, as it doesn't focus on embedded C programming, but it's interesting nonetheless. The first one demonstrates the performance of C++ and the benefits of using "unsafe" code for processor-intensive tasks. The second one somewhat debunks the first and shows that if you write the C# code a little differently, the performance is almost the same.
So I will say that C or C++ can be the clear winner in terms of performance in many cases, but oftentimes the margin is slim. Whether to use C or not is another topic altogether. In my opinion it really should depend on the task at hand, but in embedded systems you often don't have much of a choice.
A couple of people have mentioned Lua. People I know who have worked with embedded systems have said Lua is useful, but it's not really its own language per se, but more of a library that can be embedded in C. It is targeted towards use in embedded systems, and generally you'll want to call Lua code from C. But pure C makes for simpler (though not necessarily easier) maintenance, since everyone knows it.
Depending on the embedded platform, if memory constraints are an issue, you'll most likely need to use a non-garbage collected programming language.
C in this respect is likely the most well-known by the team and the most widely supported with available libraries and tools.
The truth is - not always.
It seems the .NET runtime (though any other runtime could serve as an example) imposes several MB of overhead. If this is all you have (in RAM), then you are out of luck. Java ME seems to be more compact, but it still all depends on the resources you have at your disposal.
C compilers are much faster even on desktop systems, because of how few language features there are compared to C++, so I'd imagine the difference is non-trivial on embedded systems. This translates to faster iteration times, although OTOH you don't have the conveniences of C++ (such as collections), which may slow you down in the long run.

Resources