I was discussing C programming styles with some students, and when we were talking about comments, one of them noted that he doesn't use C++ comments in C code because they are a bad idea. It turned out that this was based on personal experience with multi-line C++ comments, but it's not the first time I've heard that claim. So, is // considered harmful in C, and if so, why?
It depends on which version of C you are using. C99 allows // as a comment, whereas C89 doesn't.
If you want to be as backward compatible as possible, don't use them. But I think this is an extreme fringe case; I'm willing to bet almost everyone uses C99.
Edit: Any recent version of GCC supports most of C99. You can find more info on Wikipedia.
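To see the difference concretely, here is a small sketch showing both comment styles; a strict C89 compiler (for example gcc -std=c89 -pedantic-errors) accepts the first form but rejects the second:

/* A C89-style block comment: accepted by every C compiler. */
int a = 1;   /* a trailing C89-style comment */

// A C99 / C++ style line comment: strict C89 compilers reject this line
int b = 2;   // and this one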
C++ comments are not allowed per the MISRA-C 2004 standard. Certain industries (automotive, specifically) prize MISRA-compliant code, and therefore C++ comments are not allowed. I believe the same goes for other static code checking tools such as LDRA, etc.
This doesn't make them inherently bad, but it does mean that if you get into certain industries and want to work professionally, you will be actively discouraged from using C++ style comments.
If you use C++ comments in C, chances are that some C compilers won't accept your code. I would consider this harmful.
C++-style comments were added to C with the (not yet widely supported) C99 standard. While the standard itself isn't supported in full everywhere, some parts of it, like C++-style comments, are supported in almost every compiler by now. The fact that they were added means there was a need for them, so it's hard to argue they are inherently bad style, especially if you set yourself guidelines on where to use which.
The only reason not to use them is if you want to write a well-formed, C89-compliant program.
One common reason people use // instead of /* */ is that // comments can be "nested" inside a /* */ block while /* */ comments cannot, so you can comment out code that already contains comments. But you should really be using #if 0 for commenting out code in C anyway.
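A minimal sketch of that point (the function name is made up): an enclosing /* */ would end at the first comment inside the disabled region, whereas the preprocessor happily skips everything between #if 0 and #endif:

int example(int x)
{
#if 0
    x = compute(x);   /* an inner comment: it would terminate an enclosing block comment */
    x = x + 1;        // a C99-style line comment is also fine inside #if 0
#endif
    return x;
}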
This really shouldn't be of any concern these days, unless you're maintaining code written specifically to compile with ancient compilers and the like.
// is supported in C99, but not in C89 (which is by far the most widely supported dialect).
After half an hour of research on the Internet, I couldn't find any reasoned discussion of the advantages of function prototyping.
I get by in Java/Android, and am beginning a C course. Prototyping looks cumbersome compared to my previous experience, and I would like to know the reason(s) why it still exists in 2013.
I understand that life was more difficult for Ritchie and pals; however, a compiler could be written today that would generate a list of functions in a first pass, then do its usual thing using that list the way a current compiler uses a header file.
It can't persist only because of backwards compatibility, either. It would be feasible to create a compiler that could switch between the current mode of operation and the hypothetical new mode I just described, depending on the code it is shown.
If prototyping persists, it must therefore have value for the programmer, not for the compiler writer. Am I right or wrong, and where can I find a reasoned discussion of the advantages of function prototyping vs. no prototyping?
You're forgetting that in C you can call a function whose source you don't have.
C supports binary distribution of code, which is quite common for (commercial) libraries.
You get a header that declares the API (all functions and data types) and the code in a .lib (or whatever your platform uses) file. This is typically the case for all of C's standard library; you don't always get the source to the compiler vendor's library but you must still be able to call the functions, of course.
For that to work, the C compiler must have the declarations when processing your code, so it can generate the proper arguments for the call, and of course deal with any return value correctly.
It's not enough to just rely on your source, since if you do
GRAPHICSAPI_SetColorRGB(1, 1, 1);
but the actual declaration is:
void GRAPHICSAPI_SetColorRGB(double red, double green, double blue);
the compiler cannot magically convert your int arguments to double if it doesn't have the prototype. Of course, having the prototype makes it possible to error-check that the call makes sense, which is very valuable.
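To illustrate, a minimal sketch of how that usually looks (the header file name is made up; the function is the one from the example above):

/* graphics_api.h -- hypothetical header shipped with the binary-only library */
void GRAPHICSAPI_SetColorRGB(double red, double green, double blue);

/* your_code.c */
#include "graphics_api.h"

void draw_white(void)
{
    /* Because the prototype is visible, the compiler converts the int
       arguments 1, 1, 1 to 1.0, 1.0, 1.0 and can reject calls with the
       wrong number or types of arguments. */
    GRAPHICSAPI_SetColorRGB(1, 1, 1);
}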
It's an interesting idea to have the compiler take a first look over all source files and take note of all the function prototypes.
However:
libraries (object code) need to have their declarations somewhere; this is why the headers exist.
Also, I find it convenient to be able to grep the headers as "free text", like
grep alloc /usr/include/*
As a newb like myself, I have great difficulty searching header files such as stdio.h for a function like getchar(). Somewhere I picked up the idea that I should not be afraid to look in header files to "see how things work." (C++ Primer Plus, Stephen Prata)
I am very inexperienced with header files, to say the least, and with programming in general.
In my attempt to find getchar() I found that stdio.h simply branches to more and more headers, and locating getchar() became increasingly complicated and time-consuming; I never found it. Clearly I am going about this all wrong; my intention was merely to find some source code for the functions I am using.
My question therefore is: Where can I find source code to truly 'understand' what the standard functions are 'really' doing?
If you're looking for the declaration, the parameter order, or other usage notes, just use an online reference or man.
If you're looking for the actual code, look into an implementation of the C standard library, like GNU's libc.
It's worth noting though, that implementations are not simple, and their graph of dependencies goes far and wide. They also tend to interact with the machine on a lower level than most of us are used to.
Consider libc's implementation of getchar:
int
getchar ()
{
int result;
_IO_acquire_lock (_IO_stdin);
result = _IO_getc_unlocked (_IO_stdin);
_IO_release_lock (_IO_stdin);
return result;
}
Probably not what you were expecting :).
(Note: I have no idea how good a reference that is for C -- it's just the one I typically use for C++.)
You shouldn't search the header files; you should use the man pages or MSDN help.
The C Standard does not require the standard header files (like stdio.h) to physically exist; they can simply be built in.
If you don't know the parameters or the return value of a function, read the C Standard or the man pages.
In C, header files are used (included) to declare functions in advance. The declared functions may either be your own, with the implementation in another (or the same, for that matter) .c file, or part of an already compiled library. stdio.h is an example of the latter.
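A tiny sketch (with made-up names) of that split between declaration and implementation:

/* mymath.h -- the declaration other files include */
int add(int a, int b);

/* mymath.c -- the implementation, compiled separately */
#include "mymath.h"
int add(int a, int b)
{
    return a + b;
}

/* main.c -- uses the function; it never needs to see mymath.c */
#include <stdio.h>
#include "mymath.h"
int main(void)
{
    printf("%d\n", add(2, 3));
    return 0;
}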
You should not have to look in the header file to find the function declaration. Try using Google and typing 'man <function name>'.
cheers
That is kind of like modifying the executable file explorer.exe to perform a simple action in Windows. Those are the base files; leave them be and write your functionality in your own files. Also, if searching and altering files is an issue, make sure that you are using an IDE and not trying to do things by hand through Notepad or another program like that.
I'm writing a program which translates Pascal to C and need some help. I started with the scanner generator Flex. I defined some rules and created a scanner which is working more or less OK. It breaks Pascal syntax into tokens; for now it only prints what it found. But I have no idea what I should do next. Are there any articles or books covering this subject? What is the next step?
Why do you want to write such a Pascal to C converter?
If you just want to run some Pascal programs, it is simpler to use (or improve) existing compilers like gpc, or Pascal to C translators, like e.g. p2c
If you want to convert hand-written Pascal code to human-readable (and improvable) C code, the task is much more difficult; in particular, you probably want to preserve the indentation and the comments, keep the same names as much as possible (while avoiding clashes with system names), etc.
You always want to parse into some abstract syntax tree, but the precise nature of those trees differs. Flex + bison, or even ANTLR, may or may not be adequate (you can always write a hand-written parser). Also, error recovery may or may not be important to you (aborting on the first syntax error is very easy; trying to make sense of an ill-written, syntactically incorrect Pascal source is quite hard).
If you want to build a toy Pascal compiler, consider using LLVM (or perhaps even GCC middle-end and back-ends)
You might want to take a look at "Translating Between Programming Languages Using A Canonical Representation And Attribute Grammar Inversion" and references therein.
The most common approach would be to build a parse tree in your front end, and then walk through that tree outputting the equivalent C in the back end. This gives you the flexibility to perform any reordering of declarations that's required (IIRC Pascal supports use before declaration, but C doesn't). If you're using flex for the scanner, tradition would dictate using bison for the parser, although there are alternatives. If you look, you can probably find a freely available Pascal syntax in the format expected by bison.
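To make the "walk the tree, emit C" idea concrete, here is a very reduced sketch in C; the node types and the tiny expression are made up, and a real translator would need many more node kinds plus handling of declarations, types, and scoping:

#include <stdio.h>

typedef enum { NODE_NUM, NODE_VAR, NODE_ADD, NODE_ASSIGN } NodeKind;

typedef struct Node {
    NodeKind kind;
    int value;                 /* NODE_NUM */
    const char *name;          /* NODE_VAR */
    struct Node *left, *right; /* NODE_ADD, NODE_ASSIGN */
} Node;

/* Recursively print the C equivalent of a (Pascal-derived) expression tree. */
static void emit_c(const Node *n)
{
    switch (n->kind) {
    case NODE_NUM:    printf("%d", n->value); break;
    case NODE_VAR:    printf("%s", n->name); break;
    case NODE_ADD:    emit_c(n->left); printf(" + "); emit_c(n->right); break;
    case NODE_ASSIGN: emit_c(n->left); printf(" = "); emit_c(n->right);
                      printf(";\n"); break;  /* Pascal ':=' becomes C '=' */
    }
}

int main(void)
{
    /* represents the Pascal statement:  x := x + 1  */
    Node one = { NODE_NUM, 1, 0, 0, 0 };
    Node x   = { NODE_VAR, 0, "x", 0, 0 };
    Node add = { NODE_ADD, 0, 0, &x, &one };
    Node asg = { NODE_ASSIGN, 0, 0, &x, &add };
    emit_c(&asg);   /* prints: x = x + 1; */
    return 0;
}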
You have to know the Pascal grammar and the C grammar, and build (design) "something" (i.e. a grammar or an automaton...) that can translate every Pascal rule into the corresponding C rule.
Then, once you have your tokenized stream, using some method like LR parsing, you can find the parse tree which corresponds to the sequence of Pascal rules applied and convert every rule into the corresponding C rule (this can be easily done with Bison).
Note that Pascal and C do not have purely context-free grammars, so additional checking will be necessary.
I usually write C code in C89. Now, some features of C99 (like intxx_t or __VA_ARGS__ or snprintf) are very useful, and can even be vital.
Before I move my requirements from C89 to C99, I wanted to know which C99 features are widely supported and which ones are not widely supported or are even considered harmful.
I know we could just check our target compilers' support, but this would narrow our support a lot, and as this is open source software, I'd prefer having wider support.
For example, we use the Solaris compiler (suncc) and gcc, but there might be other compilers we would rule out when we could have kept compatibility with very little effort.
For example, I have never worked on Windows, nor do I know anything about Windows compilers, but it would be good to keep Windows compatibility.
goto is still considered harmful.
Somehow I have collected four down votes. I presented the statement above to add levity, and am only 30% serious about the concept behind it.
I expect the down votes are from youngsters who do not understand the history of programming languages. Not every single goto is evil, but, compared to the 100% unadulterated spaghetti code I have worked on (millions of lines of FORTRAN 66), it is reasonable and productive to replace as many goto statements as possible with structured statements (for, while, do .. while, switch). But sometimes a goto is just fine when it avoids complexity, such as extra flag variables to break out of multiple nested loops.
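For example, a minimal sketch of the nested-loop case (the search condition is made up):

#include <stdio.h>

int main(void)
{
    int i, j;
    for (i = 0; i < 10; i++) {
        for (j = 0; j < 10; j++) {
            if (i * j == 42)
                goto found;     /* a break here would only leave the inner loop */
        }
    }
    printf("not found\n");
    return 0;

found:
    printf("found at i=%d, j=%d\n", i, j);
    return 0;
}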
Well, gcc is basically going to be gcc regardless of which desktop OS you're targeting.
Visual C++, being primarily a C++ compiler, isn't quite as concerned with the C99 spec. stdint.h does declare your favorite intxx_t types. __VA_ARGS__ is available. _Bool, _Complex, and _Pragma aren't implemented in the Microsoft Visual C++ compiler. I'm pretty sure the %a conversions in printf/scanf haven't been implemented, though maybe VC2010 handles them. snprintf is present, but has a leading underscore and slightly different semantics.
Short answer: The "easier" a C99 feature is to implement without changing compiler grammars or replumbing the standard library, the more likely VC++ is to support it. If there's a conflict between C99 and C++, expect C++ to win.
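As an illustration, a rough sketch of the usual workaround (assuming a pre-2015 MSVC, which ships _snprintf rather than a conforming snprintf):

#if defined(_MSC_VER) && _MSC_VER < 1900
/* Beware: unlike C99 snprintf, _snprintf may not null-terminate on
   truncation and returns a negative value instead of the needed length. */
#define snprintf _snprintf
#endif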
A number of C99 features are optional, so their lack isn't technically non-conforming. I won't distinguish below.
Hmm, Windows doesn't have <stdint.h>, although there is an open-source version of stdint.h for Microsoft compilers. Even when the file is provided, many of the individual types are missing.
Complex and imaginary support is often missing or broken.
Extended identifiers and wide characters can be problem points.
See this list of C99 feature issues in gcc.
Runtime sizeof (on variable-length arrays) is a nightmare for compiler writers, so I consider it harmful.
glibc does not implement a C99-conforming realloc, so realloc(ptr, 0) is not portable.
http://sourceware.org/bugzilla/show_bug.cgi?id=12547
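One way to sidestep that is never to call realloc with a size of 0; a minimal sketch of such a wrapper (the function name is made up):

#include <stdlib.h>

/* Resize a buffer without ever passing 0 to realloc, since the result of
   realloc(ptr, 0) differs between C libraries. */
void *resize(void *ptr, size_t new_size)
{
    if (new_size == 0) {
        free(ptr);
        return NULL;
    }
    return realloc(ptr, new_size);
}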
restrict became a keyword in C99. That is the implementation encroaching on the user's namespace. If you have a valid C89 program that happens to contain the word restrict, you must change your program to make it work with C99. In other words: no backward compatibility. If they were going to break backward compatibility, they should have removed gets from the standard first.
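For illustration, this is a valid C89 translation unit that no longer compiles as C99, where restrict is a keyword (in C99 it is instead used as a pointer qualifier, as in void f(double * restrict p)):

int main(void)
{
    int restrict = 1;   /* fine in C89; a syntax error when compiled as C99 */
    return restrict;
}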
The type generic maths functions from <tgmath.h> are not necessarily widely implemented, though they do seem to be provided with GCC 4.2.1 on MacOS X 10.6.2.
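Where the header is available, usage looks like this; the same call resolves to sinf, sin, or sinl depending on the argument type:

#include <stdio.h>
#include <tgmath.h>

int main(void)
{
    float       f = sin(0.5f);   /* resolves to sinf */
    double      d = sin(0.5);    /* resolves to sin  */
    long double l = sin(0.5L);   /* resolves to sinl */
    printf("%f %f %Lf\n", f, d, l);
    return 0;
}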
I was wondering whether, and to what degree, Microsoft's Visual C++ compiler conforms to the current C (C90/C99) and C++ (ISO/IEC 14882:2003) standards. Unfortunately I've only been able to find partial information on the subject; I may be looking in all the wrong places.
Any pointers to related resources are much appreciated. Thanks in advance.
Edit:
Since it looks like this is a most touchy subject, I'd be content with a yes/no answer on whether MSVC wholly conforms to C90. I've come to the understanding that this is not the case for C99 (naturally), and I still have no clue about C++.
Edit2:
Thanks to everyone for their answers. I've accepted Mr. Rushakov's answer but upvoted all relevant answers, which were all helpful.
Perhaps MSDN's Nonstandard Behavior page for Visual C++ will enlighten you? Make sure you look at the version you're most interested in (the box on the right-hand side).
Since MSDN's links change all the time (and who knows why), here's the main content from the page on VS2008, so when the link breaks and someone comes across this answer, they can Google and find the correct page:
Nonstandard Behavior
The following topics are some of the known places where the Visual C++ implementation of C++ does not agree with the C++ standard. The section numbers refer to section numbers in the C++ standard.
Compiler Limits
10.3 (Paragraph 5) Covariant Return Types
14 export Keyword on a Template
14.6.2 Dependent Names
15.4 Function Exception Specifiers
16.3.2 The # Operator
21.1.1 Character Traits Requirements
Storage Location of Objects
My pet peeve, which most programmers find unimportant but which I personally find hurts readability a lot, is that VC++ is unable to compile the following C++ code:
bool result = true and not false;
… because VC++ doesn’t recognize and, or and not (along with the rest of ISO 646) as valid tokens.
Clarification: The standard mentions the treatment of the above tokens in §2.12, marks them as reserved in §2.11 and defines an equivalence mapping for them in §2.5 to the more conventional operator representations (e.g. and corresponds to &&). It isn’t clear why they get a special status next to the other keywords. Even more confusingly, appendix C2.2 suddenly calls them “keywords”. Still, the standard is absolutely clear about their treatment and semantics. VC simply doesn’t implement these paragraphs (unless you specify the /Za flag during compilation).
Here's a nice summary in the MSDN blog titled 'C++11 Features in Visual C++ 11' which was updated March 2nd, 2012.
Visual C++ 2003, 2005, and 2008 conform to C89 and C++98.
Some additional features are cherry-picked from C99, and there are a few enhancements on top of C++98.
Standards compliance for C and C++ has been rather poor in VS. Things began changing with 2005 and have been getting better. VS2010 is what I am really looking at, with quite a lot of features from C++0x. Most of the time, though, I end up Googling with the following keywords:
msdn ANSI C conformance
msdn ISO C++ conformance
etc. when I really really need to figure out why something doesn't work as defined.
I don't use VS 2008 yet, so I can only speak for VS 2005.
It doesn't support C99. Support for C89/90 has always been good in VC and I'm not aware of any non-compliance issues with it.
C++98 support has a number of issues; some of them are documented by MS as known issues and some are plain bugs. I made a blog entry to use as a "notebook" for various VS 2005 C++ bugs I encounter in practice. If you wish, you can take a look here, although this list is probably far from complete. For example:
__try is marked as an extension