Visual C++ standards compliance [closed]

I was wondering if, and to what degree, Microsoft's Visual C++ compiler conforms to the current C (C90/C99) and C++ (ISO/IEC 14882:2003) standards. Unfortunately I'm only able to find partial information on the subject; I may be looking in all the wrong places.
Any pointers to related resources are much appreciated. Thanks in advance.
Edit:
Since it looks like this is a most touchy subject, I'd be content with a yes/no answer on whether MSVC wholly conforms to C90. I've come to the understanding that this is not the case for C99 (naturally), and I still have no clue about C++.
Edit2:
Thanks to everyone for their answers. I've accepted Mr. Rushakov's answer but upvoted all relevant answers, which were all helpful.

Perhaps MSDN's Nonstandard Behavior page for Visual C++ will enlighten you? Make sure you look at the version you're most interested in (the box on the right-hand side).
Since MSDN's links change all the time (and who knows why), here's the main content from the page on VS2008, so when the link breaks and someone comes across this answer, they can Google and find the correct page:
Nonstandard Behavior
The following topics are some of the known places where the Visual C++ implementation of C++ does not agree with the C++ standard. The section numbers refer to section numbers in the C++ standard.
Compiler Limits
10.3 (Paragraph 5) Covariant Return Types
14 export Keyword on a Template
14.6.2 Dependent Names
15.4 Function Exception Specifiers
16.3.2 The # Operator
21.1.1 Character Traits Requirements
Storage Location of Objects

My pet peeve, which most programmers find unimportant but which I personally find hurts readability a lot, is that VC++ is unable to compile the following C++ code:
bool result = true and not false;
… because VC++ doesn’t recognize and, or and not (along with the rest of ISO 646) as valid tokens.
Clarification: The standard mentions the treatment of the above tokens in §2.12, marks them as reserved in §2.11 and defines an equivalence mapping for them in §2.5 to the more conventional operator representations (e.g. and corresponds to &&). It isn’t clear why they get a special status next to the other keywords. Even more confusingly, appendix C2.2 suddenly calls them “keywords”. Still, the standard is absolutely clear about their treatment and semantics. VC simply doesn’t implement these paragraphs (unless you specify the /Za flag during compilation).
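For reference, here is a minimal sketch of these alternative spellings in C, where (unlike in C++) they are macros supplied by <iso646.h> rather than built-in tokens:

    /* Alternative spellings as defined by <iso646.h> in C; in C++ the same
       spellings are built into the language, which is what the bool example
       above relies on. */
    #include <iso646.h>
    #include <stdio.h>

    int main(void)
    {
        int result = 1 and not 0;   /* preprocesses to: 1 && !0 */
        printf("%d\n", result);     /* prints 1 */
        return 0;
    }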

Here's a nice summary in the MSDN blog post titled 'C++11 Features in Visual C++ 11', which was updated on March 2nd, 2012.

Visual C++ 2003, 2005, and 2008 conform to C89 and C++98.
Some additional features are cherry-picked from C99, and there are a few enhancements on top of C++98.

Standards compliance for C and C++ has been rather poor in VS. Things began changing with VS2005 and have been getting better. VS2010 is what I'm really looking at, with quite a lot of features from C++0x. Most of the time, though, I end up Googling with the following keywords:
msdn ANSI C conformance
msdn ISO C++ conformance
etc., when I really need to figure out why something doesn't work as defined.

I don't use VS 2008 yet, so I can only speak for VS 2005.
It doesn't support C99. Support for C89/90 has always been good in VC and I'm not aware of any non-compliance issues with it.
C++98 support has a number of issues; some of them are documented by MS as known issues and some are plain bugs. I made a blog entry to use as a "notebook" for various VS 2005 C++ bugs I encounter in practice. If you wish, you can take a look here, although the list is probably far from complete.

__try is marked as an extension (it is Microsoft's structured exception handling keyword, not part of standard C or C++).

Related

C11, 6.6.10: IB: other forms of constant expressions: additional conformance documentation is needed

Why does it seem to be general practice for C compiler vendors not to provide end users with additional conformance documentation about implementation-defined behavior regarding «other forms of constant expressions» (C11, 6.6.10)?
C11, 6.6.10:
An implementation may accept other forms of constant expressions.
This fact leads to the following reactions / feedback (taken from different sources):
SO user M.M:
The compiler vendor should publish conformance documentation listing which expressions it accepts as constants, although I couldn't find that documentation for MSVC. (leave a comment if you can!)
Source: https://stackoverflow.com/a/62161678/9881330
SO user Keith Thompson:
Admittedly the standard doesn't seem to require such documentation (which I find a little surprising).
Source: https://gcc.gnu.org/bugzilla/show_bug.cgi?id=66618 (2015-07-01 00:48:48 UTC)
Since 6.6.10 relates to implementation-defined behavior, and since «each implementation shall include documentation describing its characteristics and behavior» (C++ standard, section 1.9), why is this not general practice in the case of 6.6.10? If someone here represents an (industrial) C compiler vendor, please give the reason or comment on the situation.
P.S. The origin of the question is the possible portability issues related to «other forms of constant expressions». It would save a lot of time if end users knew exactly which «other forms of constant expressions» are «accepted by the implementation» before writing the code (and not after, when surprised by the portability issues).
UPD. A note on «When making use of implementation-defined behavior, I would assume portability issues until proven otherwise»: if a software product is meant to be portable between N compilers, and all N compilers support the same IB-related language feature, which is useful while writing the code but considered implementation-defined behavior, then why not use it? The only question is that we need to know in advance that this IB-related language feature is supported by all N compilers. (Yes, we can establish this empirically, but for many IB-related language features that would probably be time-consuming. It is better to have an official statement from the compiler vendor that the feature is or is not supported.)
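As a concrete (and hypothetical) illustration of the kind of construct 6.6.10 covers, the sketch below uses a const-qualified object as an array size at file scope. That is not one of the constant-expression forms the standard requires an implementation to accept, so whether it compiles depends entirely on whether the implementation chooses to accept it as an «other form»:

    /* Hypothetical sketch: a const-qualified object is NOT an integer constant
       expression in standard C (unlike C++), so a conforming compiler may
       reject this file-scope declaration, while another may accept it under
       C11 6.6 paragraph 10 ("an implementation may accept other forms of
       constant expressions"). */
    static const int size = 16;
    static char buffer[size];   /* accepted or rejected: implementation-specific */

    int main(void)
    {
        buffer[0] = 'x';
        return buffer[0] == 'x' ? 0 : 1;
    }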

Should I use ANSI C (C89)?

It's 2012. I'm writing some code in C. Should I still be using C89? Are there still compilers that do not support C99?
I don't mind using /* */ instead of //.
I'm not sure how I feel about C89 forbidding mixing declarations and code. I'm kind of leaning towards the idea that it's actually more readable to have all the declarations in one place, and if it isn't, the function is too long.
VLAs look useful but I haven't needed them yet.
Should I stick with C89 if I don't have a compelling reason not to? Are there other things I haven't considered?
Unless you know that you cannot use a C99-compatible compiler (the Visual Studio C compiler is the most prominent candidate), there is no good reason not to use the nice things C99 gives you.
However, even if you need to support that compiler you can use some C99 features - just not all of them.
One feature of C99 that is incredibly handy is being able to do for (int i = ...) instead of having to declare your loop variable at the top of the function, especially since C actually has block scope. That's the kind of declaration where having it at the top really doesn't improve readability.
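A minimal sketch of that difference (nothing compiler-specific assumed here):

    #include <stdio.h>

    int main(void)
    {
        /* C89: the loop variable must be declared at the top of the block:
        int i;
        for (i = 0; i < 3; i++)
            printf("%d\n", i);
        */

        /* C99: the loop variable is declared in the for statement and is
           scoped to the loop itself. */
        for (int i = 0; i < 3; i++)
            printf("%d\n", i);

        return 0;
    }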
There is a reason (or many) why C89 was superseded by C99. If you know for sure that no C99 compiler is available for your particular piece of work (unlikely, unless you are stuck with Visual Studio, which never officially supported C anyway), then you need to stay with C89; otherwise you should certainly put yourself in a position where you can benefit from the last 20+ years of improvement. There is nothing inherently slower about C99.
Perhaps you should even consider looking at the newest C11 standard. There have been some important fixes for dealing with Unicode that any C programmer could benefit from (other changes in the standard are absolutely minimal)...
Good code is a mixture of performance, scalability, readability, and maintainability.
In my opinion, C99 makes code easier to read and maintain. Very, very few compilers don't support C99, so I say go with it. Use the tools you have available, unless you are certain you will need to compile your project with a compiler that requires the earlier standard.
Check out these links to learn more about the advantages to C99:
http://www.kuro5hin.org/?op=displaystory;sid=2001/2/23/194544/139
http://en.wikipedia.org/wiki/C99#Design
Note that C99 also adds library functions such as snprintf, which are very useful, and it has better floating-point support. Also, I find macros to be extremely helpful, especially when working with math-intensive applications (like cryptographic algorithms).
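For instance, here is a small sketch of the snprintf usage mentioned above (note that older MSVC versions only provide _snprintf, with slightly different semantics, as another answer in this thread points out):

    #include <stdio.h>

    int main(void)
    {
        char buf[16];

        /* snprintf never writes more than sizeof buf bytes and always
           NUL-terminates (for a nonzero size); the return value is the
           length the untruncated string would have had. */
        int needed = snprintf(buf, sizeof buf, "value = %d", 123456789);
        printf("%s (wanted %d characters)\n", buf, needed);   /* truncated output */
        return 0;
    }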
I disagree with Paul R's "bottom line" comment. There are multiple cases where C89 is advantageous for portability.
Targeting embedded systems, which may or may not have compilers supporting C99:
https://groups.google.com/forum/#!topic/comp.arch.embedded/WNvhw3T_9pI%5B1-25%5D
Targeting the TinyCC compiler, as might be required in a restricted environment where installing a gigantic toolchain is either impractical or not allowed. (TCC is no longer being developed, and Bellard's last statement as to ISOC99 support was that it was "heading towards" full compliance.)
Supporting dynamic compilation via libtcc (see above).
Targeting MSVC, as others have noted.
For source-compatibility with projects that may be required by their company to use the C89 standard. This is especially relevant if you're writing an open source library, and want to maximize its application in some industry.
As cegfault noted, some of the C99 features as listed on Wikipedia can be very useful, but none I would consider indispensable if your priority is portability, or any of the above reasons apply.
It appears Microsoft hasn't budged on C99 compliance. SimonRev from Beijer Electronics commented on a related MSDN thread in November 2016:
In broad strokes, the only parts of the C99 compiler that were implemented are those parts that they needed to keep the C++ compiler up to date.
Microsoft has done basically nothing to the C compiler since VC6, and they haven't made much secret that C++ is their vision of the future of native code, not C.
In conclusion, if you want portability for embedded or restricted systems, dynamic compilation, MSVC, or compatibility with proprietary source code, I would say C89 is advantageous.

Pascal to C converter [closed]

I'm writing a program which translates Pascal to C and I need some help. I started with the scanner generator Flex. I defined some rules and created a scanner which is working more or less OK. It breaks Pascal syntax into tokens; for now it only prints what it found. But I have no idea what I should do next. Are there any articles or books covering this subject? What is the next step?
Why do you want to write such a Pascal-to-C converter?
If you just want to run some Pascal programs, it is simpler to use (or improve) existing compilers like gpc, or Pascal-to-C translators such as p2c.
If you want to convert hand-written Pascal code to human-readable (and improvable) C code, the task is much more difficult; in particular, you probably want to keep the indentation and the comments, preserve the same names as much as possible (while avoiding clashes with system names), etc.
In both cases you want to parse into some abstract syntax tree, but the precise nature of these trees is different. Perhaps flex + bison or even ANTLR may or may not be adequate (you can always write a hand-written parser). Also, error recovery may or may not be important to you (aborting on the first syntax error is very easy; trying to make sense of an ill-written, syntactically incorrect Pascal source is quite hard).
If you want to build a toy Pascal compiler, consider using LLVM (or perhaps even GCC middle-end and back-ends)
You might want to take a look at "Translating Between Programming Languages Using A Canonical Representation And Attribute Grammar Inversion" and references therein.
The most common approach would be to build a parse tree in your front end, and then walk through that tree outputting the equivalent C in the back end. This gives you the flexibility to perform any reordering of declarations that's required (IIRC Pascal supports use before declaration, but C doesn't). If you're using flex for the scanner, tradition would dictate using bison for the parser, although there are alternatives. If you look, you can probably find a freely available Pascal syntax in the format expected by bison.
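A very small sketch of that front-end/back-end split, with a hypothetical node layout (a real translator needs many more node kinds, a symbol table, and the declaration reordering mentioned above):

    #include <stdio.h>

    typedef enum { NODE_NUM, NODE_ADD } NodeKind;

    typedef struct Node {
        NodeKind kind;
        int value;                /* used by NODE_NUM */
        struct Node *lhs, *rhs;   /* used by NODE_ADD */
    } Node;

    /* Walk the tree and print the equivalent C expression. */
    static void emit_c(const Node *n)
    {
        switch (n->kind) {
        case NODE_NUM:
            printf("%d", n->value);
            break;
        case NODE_ADD:
            putchar('(');
            emit_c(n->lhs);
            printf(" + ");
            emit_c(n->rhs);
            putchar(')');
            break;
        }
    }

    int main(void)
    {
        /* Tree for the Pascal expression "1 + 2", built by hand here; a real
           front end would construct it from the scanner's token stream. */
        Node one = { NODE_NUM, 1, NULL, NULL };
        Node two = { NODE_NUM, 2, NULL, NULL };
        Node sum = { NODE_ADD, 0, &one, &two };
        emit_c(&sum);
        putchar('\n');
        return 0;
    }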
You have to know the Pascal grammar and the C grammar, and build (design) a "something" (i.e. a grammar or an automaton...) that can translate every Pascal rule into the corresponding C rule.
Then, once you have your tokenized stream, using some method like LR parsing you can find the syntax tree corresponding to the sequence of Pascal rules applied and convert every rule into the corresponding C rule (this can be easily done with Bison).
Note that Pascal and C do not have purely context-free grammars, so more control will be necessary.

Are C++ comments considered bad style in C? [closed]

I was discussing C programming styles with some students and when we were talking about comments, one of them noted that he doesn't use C++ comments in C code because they are a bad idea. Turns out that it was based on personal experience with multi-line C++ comments, but it's not the first time I've heard that claim. So, is // considered harmful and if so, then why?
It depends on which version of C you are using. C99 allows // as a comment, whereas C89 doesn't.
If you want to be as backward compatible as possible, don't use them. But I think this is an extreme fringe case; I'm willing to bet almost everyone uses C99.
Edit: Any recent version of GCC supports most of C99. You can find more info on Wikipedia.
C++ comments are not allowed per the MISRA-C 2004 standard. Certain industries (automotive, specifically) prize MISRA-compliant code, and therefore C++ comments are not allowed there. I believe the same goes for other static code checking tools such as LDRA, etc.
This doesn't make them inherently bad, but it does mean that if you get into certain industries and want to work professionally, you will be actively discouraged from using C++ style comments.
If you use C++ comments in C, chances are that some C compilers won't accept your code. I would consider this harmful.
C++-style comments were added to C with the (not yet widely supported) C99 standard. While the standard itself isn't widely supported in full, some parts of it (like C++-style comments) are supported in almost every compiler by now. The fact that they were added means there's a need for them, so it's easy to figure out that they wouldn't be considered bad style -- especially if you set yourself guidelines on where to use which.
The only reason not to use them is if you want to write a well-formed, C89-compliant program.
One common reason why people use // instead of /* */ is that the former can effectively be nested (code containing // comments can still be wrapped in /* */) while the latter cannot, so you can comment out code that has comments in it. But you should really be using #if 0 for commenting out code in C anyway.
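A tiny sketch of why #if 0 is the safer tool here:

    #include <stdio.h>

    int main(void)
    {
    #if 0
        /* Everything here is compiled out, existing comments included.
           Wrapping this region in another block comment instead would end
           at the first closing comment marker and break the build. */
        printf("disabled\n");
    #endif
        printf("enabled\n");
        return 0;
    }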
This really shouldn't be of any concern these days, unless you're maintaining code written specifically to compile with ancient compilers and the like.
"//" is supported in C99, but in C89 (which is the by far most supported dialect) it's not supported.

What C99 features are considered harmful or unsupported [closed]

I usually write C code in C89. Now, some features of C99 (like intxx_t, __VA_ARGS__, or snprintf) are very useful and can even be vital.
Before I move my requirements from C89 to C99, I wanted to know which C99 features are widely supported and which ones are not widely supported or even considered harmful.
I know we could just check our target compilers' support, but this would narrow our support a lot, and as this is for open source software, I'd prefer having wider support.
For example, we use the Solaris compiler (suncc) and gcc, but there might be other compilers we would rule out even though we could keep compatibility with very little effort.
For example, I have never worked on Windows, nor do I know anything about Windows compilers, but it would be good to keep Windows compatibility.
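For reference, here is a small sketch of the features named above: a hypothetical debug macro using __VA_ARGS__, plus a fixed-width type from <stdint.h> with its <inttypes.h> printf macro.

    #include <inttypes.h>   /* fixed-width types and their printf macros */
    #include <stdio.h>

    /* Variadic macro, new in C99. */
    #define DEBUGF(fmt, ...) fprintf(stderr, "debug: " fmt "\n", __VA_ARGS__)

    int main(void)
    {
        int32_t counter = 42;
        DEBUGF("counter = %" PRId32, counter);
        return 0;
    }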
goto is still considered harmful.
Somehow I have collected four down votes. I presented the statement above to add levity, and am only 30% serious about the concept behind it.
I expect the down votes are from youngsters who do not understand the history of programming languages. Not every single goto is evil, but, compared to the 100% unadulterated spaghetti code I have worked on (millions of lines of FORTRAN 66), it is reasonable and productive to replace as many goto statements as possible with structured statements (for, while, do ... while, switch). But sometimes a goto is just fine when it avoids complexity, such as extra flag variables to break out of multiple nested loops.
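A minimal sketch of that last case, breaking out of nested loops without a flag variable:

    #include <stdio.h>

    int main(void)
    {
        int grid[3][3] = { {0, 0, 0}, {0, 7, 0}, {0, 0, 0} };
        int i, j;

        for (i = 0; i < 3; i++) {
            for (j = 0; j < 3; j++) {
                if (grid[i][j] == 7)
                    goto found;   /* leaves both loops at once */
            }
        }
        printf("not found\n");
        return 0;

    found:
        printf("found at %d,%d\n", i, j);
        return 0;
    }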
Well, gcc is basically going to be gcc regardless of which desktop OS you're targeting.
Visual C++, being primarily a C++ compiler, isn't quite as concerned with the C99 spec. stdint.h does declare your favorite intxx_t types. __VA_ARGS__ is available. _Bool, _Complex, and _Pragma aren't implemented in the Microsoft Visual C++ compiler. I'm pretty sure %a fields in printf/scanf haven't been implemented, though maybe VC2010 handles them. snprintf is present, but has a leading underscore and slightly different semantics.
Short answer: The "easier" a C99 feature is to implement without changing compiler grammars or replumbing the standard library, the more likely VC++ is to support it. If there's a conflict between C99 and C++, expect C++ to win.
A number of C99 features are optional, so their lack isn't technically non-conforming. I won't distinguish below.
Hmm, Windows doesn't have <stdint.h>, although there is an open-source version of stdint.h for Microsoft compilers. Even when the file is present, many of the individual types are missing.
Complex and imaginary support is often missing or broken.
Extended identifiers and wide characters can be problem points.
See this list of C99 feature issues in gcc.
Runtime sizeof (sizeof applied to a variable-length array, which must be evaluated at run time) is a nightmare for compiler writers, so I consider it harmful.
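A short sketch of what "runtime sizeof" means in practice:

    #include <stdio.h>

    static void show(int n)
    {
        int vla[n];                     /* C99 variable-length array */
        printf("%zu\n", sizeof vla);    /* evaluated at run time: n * sizeof(int) */
    }

    int main(void)
    {
        show(4);   /* prints 16 on a platform with 4-byte int */
        show(9);   /* prints 36 */
        return 0;
    }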
glibc does not implement a C99-conforming realloc, so realloc(ptr, 0) is not portable.
http://sourceware.org/bugzilla/show_bug.cgi?id=12547
restrict became a keyword in C99. That is implementation encroaching on users' namespace. If you have a valid C89 program that contains the word restrict, you must change your program to make it work with C99. In other words: no backward compatibility. If they were going to break backward compatibility, they should have removed gets from the standard first.
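To make the clash concrete, here is a small sketch: a C89 program declaring an identifier named restrict stops compiling under C99, where the word is a keyword used to promise that pointers do not alias.

    #include <stdio.h>

    /* int restrict = 0; */   /* valid C89 declaration, syntax error in C99 */

    /* The C99 keyword's intended use: dst and src are promised not to alias,
       which lets the compiler optimize the loop more aggressively. */
    static void scale(float *restrict dst, const float *restrict src, int n, float k)
    {
        int i;
        for (i = 0; i < n; i++)
            dst[i] = k * src[i];
    }

    int main(void)
    {
        float in[3] = { 1.0f, 2.0f, 3.0f }, out[3];
        scale(out, in, 3, 2.0f);
        printf("%g %g %g\n", out[0], out[1], out[2]);
        return 0;
    }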
The type generic maths functions from <tgmath.h> are not necessarily widely implemented, though they do seem to be provided with GCC 4.2.1 on MacOS X 10.6.2.

Resources