Meaning of #use directive in legacy VC6 C code - c-preprocessor

While resurrecting old, VC6 code, I ran into the following
#use "default.h"
#use aasc.lib
#use aascz0.lib
I have not been able to find documentation for #use. I think the directive is a combination of #include (as in #use "default.h" meaning #include "default.h") and a linker directive (as in #use aasc.lib and #use aascz0.lib being propagated to the object file, perhaps like /DEFAULTLIB "aasc.lib" "aascz0.lib"), but I'm not confident.
In any case, the compiler (Visual Studio 2017 Community) rejects the statements with a C1021 diagnostic ("invalid preprocessor command 'use'").
Can any "archeologists" shed light on the #use directive?
Thank you.

This isn't VC6 code. Microsoft Visual C 6.0 documentation is still available on the web, and it makes no mention of #use directives. My guess is that you're looking at Dynamic C 6.0 code.
Dynamic C is a product of Rabbit Semiconductor, originally a division of Z-World, Inc., which was sold off in 2006 to Digi International. This will probably become a dead link fairly quickly, but here is a Dynamic C manual; grab it while you can... it documents #use.
In particular, default.h has a set of #use directives for each Rabbit product. AASC.LIB is the Abstract Application-Level Serial Communications library; AASCZ0.LIB contains support functions for built-in Z0 functions, according to this much older reference that seems targeted to Zilog Z180 and Dynamic C version 6.x.

Related

How to compile picoProlog from source code?

I am a student in Computer Science, and I am learning about logic programming with Prolog.
I have found an interesting Prolog interpreter, picoProlog (http://spivey.oriel.ox.ac.uk/corner/Logic_Programming).
To learn more about Prolog, I am trying to compile its source code, but I failed.
In this web page, they said:
The interpreter source is written in a minimal dialect of Pascal, avoiding many features including pointers, but using macros to overcome some of Pascal's limitations, in a style inspired by Kernighan and Plauger's book Software tools in Pascal. It comes with a translator from the Pascal dialect into C that can be used to build the interpreter and also source for the macro processor that is needed.
To build the interpreter on a Linux machine, just extract the tar file and type make. The building happens in several stages:
First, the Pascal-to-C translator ptc is built from C source, including a lexer and parser written with lex and yacc. The file README gives some details of the very restricted Pascal subset accepted by this translator.
Next, ptc is used to build the macro processor ppp.
Finally, the picoProlog interpreter is built from the source code in the file pprolog.x by first expanding macros using ppp to obtain a file pprolog.p, then translating to C with ptc, and lastly compiling the C code.
Text and software copyright © J. M. Spivey, 1996, 2002, 2010.
They only describe compiling on Linux, so I don't know how to compile this source code on a Windows machine. Can I compile it with Turbo Pascal 7.0 (without any other requirements) on Windows XP? Can some part of the build script be removed so that only the Pascal compilation remains?
I found this question while googling, and though it's old, I thought it would be helpful to add a definitive answer from the author of the program.
It is indeed not too hard to get picoProlog to compile with the Free Pascal Compiler. I've incorporated Marco's suggestions into the source, fixed a small bug that was revealed, and added a workaround for an odd feature of Free Pascal. The results can be found on the GitHub page:
https://github.com/Spivoxity/pprolog
with instructions for building in the README.
Note: I built this with Free Pascal under Linux on x86_64, but haven't tested it on Windows. I can't see a reason why it wouldn't work.
Edit 18 Oct 2022 -- Replaced BitBucket with GitHub.
To avoid spending more time getting the P2C/PTC bootstrapping to run (you are probably only interested in the interpreter, not its *nix bootstrapping), I think it is easier to forget the PTC stuff and focus on getting the Pascal parts to compile and work with FPC 2.6.x. The steps below took 10 minutes and produce a standalone Windows EXE with 10-20 added lines of code.
Start with ppp; hmm, that compiles (and works!) out of the box:
D:\dls\prlg\pprolog>fpc ppp.p
Free Pascal Compiler version 2.6.2 [2013/02/12] for i386
Copyright (c) 1993-2012 by Florian Klaempfl and others
Target OS: Win32 for i386
Compiling ppp.p
Linking ppp.exe
394 lines compiled, 0.1 sec , 30352 bytes code, 1692 bytes data
The code does look like it is meant to have its input piped in. We haul pprolog.x through ppp, and the result (pprolog.pp) almost compiles. There are four problems, all fixable by adding some code to the top without changing the original code (marked with MVDV: in the source):
Some range-check errors, because the integer type is too small for the 1 MB stack space that is set up. This rules out Turbo Pascal, but we can work around it by defining integer as longint.
It seems to assume that forward-declared functions don't need their arguments repeated, while in FPC they generally do. Fixed.
In the final function ("initialize"), some non-standard PTC library functions are used that borrow from C (argv, argc) instead of their typical Pascal equivalents. Fixed.
(Reported by the original author after testing.) ParseFactor has a right-hand recursive call that FPC, in its default mode, reads as accessing the function result rather than making the call. Enable TP mode ({$mode tp} above the uses line), or add () to disambiguate.
After these, pprolog.pp compiles with FPC:
Free Pascal Compiler version 2.6.2 [2013/02/12] for i386
Copyright (c) 1993-2012 by Florian Klaempfl and others
Target OS: Win32 for i386
Compiling pprolog.pp
pprolog.pp(487,19) Warning: unreachable code
pprolog.pp(532,19) Note: Local variable "dummy" is assigned but never used
Linking pprolog.exe
2150 lines compiled, 0.1 sec , 84400 bytes code, 13932 bytes data
1 warning(s) issued
1 note(s) issued
Some notes:
UNTESTED
I don't know if I got the range of argv/argc exactly right (0..argc-1, while ParamCount is 1-based, etc.). Check if necessary.
The string system predates the TP string type and is convoluted (probably because of PTC; see the README), so I don't know if it will work.
I've put the resulting, compiling source code at http://www.stack.nl/~marcov/files/pprolog.pp
Good luck!
Given how many different variations of Pascal have existed, my gut feeling is it's easier to get hold of a Linux environment than to adjust the Pascal source code to fit the compiler you have. And this is only the first step.
Getting a Linux environment? Try VirtualBox: https://www.virtualbox.org

Is there a system task or pre-processor directive in SystemVerilog for retrieving the used standard version?

I implemented an SV module which contains soft constraints. However, as far as I know, soft constraints are only supported since the 1800-2012 standard. Therefore I would like to add an alternative implementation in case a simulator is used that only supports older versions of the standard.
Is there a way to retrieve this information with a system task or preprocessor directive, something like:
if($get_version_specifier == "1800-2012")
// do fancy stuff with soft constraints
else
// alternative fancy stuff
I already found an option for a similar problem using begin_keywords/end_keywords, but I think that would not solve my issue, since it only selects the set of keywords for a specific standard; if the simulator does not support that version, I guess it would only produce an error.
Thanks in advance!
sebs
The problem you ask about is more complicated than it seems. Different features of SystemVerilog are implemented by different versions of tools, sometimes before the standard is released, sometimes after. I do know that some tools supported soft constraints before the release of the 1800-2012 standard, and no commercial tool that I know of yet supports operator overloading, which was in the first IEEE 1800-2005 standard.
A better solution would be to define a set of macros like USE_SOFT_CONSTRAINTS for features that do not have universal support. Then you can include a common featureset.svh file that defines the feature set you want to use. Another good practice is to DOCUMENT the reason you added a specific feature macro (i.e., the tool version you were using that didn't support the feature, and why you decided it was worth the effort to maintain both branches of the code).
As far as I know, there isn't any "standard" way of getting the version of the standard you are using. C++ had a similar problem before the 2011 release (see here). One answer there states that different compilers added different proprietary defines (something like the INCA macro set for the Incisive simulator). You'll have to ask your simulator vendor if a define for the version of the SV standard exists (something like SV2012_OR_GREATER).
Cadence, for example, has something like this for Specman, so if they're consistent they might have this for SystemVerilog as well. Assuming such a thing exists, you could have:
`ifdef SV_2012_OR_GREATER
// do fancy stuff with soft constraints
`else
// alternative fancy stuff
`endif
Bonus: soft constraints are a declarative construct, so I don't see how you could use an if block to decide whether to use them (unless it's an if inside a constraint block). Also, I'm not sure you can truly emulate soft constraints in any way, however fancy your approach, so I don't know if it really makes sense to try.

Why doesn't C code compile properly in Visual Studio?

When trying to compile some C code in Visual Studio, I often get numerous errors. The reason is that Visual Studio's C compiler only supports an old version of C. How can I quickly fix my C code to be compatible with the Visual Studio compiler?
For example, I'm trying to compile websocket.c and associated headers—from http://libwebsockets.org/trac/libwebsockets. I'm getting a lot of errors about "illegal use of this type as an expression" which, according to other answers, indicates that I need to move my variable declarations to the beginning of every block.
The problem with compiling C in Visual Studio
Visual Studio does not provide full support for modern C. If you want C code to be portable enough to compile with Visual Studio, you'll probably have to target C89 or compile it as C++ code. The first option is unnecessarily restrictive, unless for some reason you really love the '89 C standard and hate all the features of later standards.
Compiling as C++
The second option, compiling as C++, can be achieved, as dialer mentions in his comment, by changing the target language type. You can do this by right-clicking the source file(s), selecting Properties, navigating to C/C++ -> Advanced, and changing the Compile As option to Compile as C++ Code.
You can also specify the source file type as C++ by using the /Tp <filename> switch on the command line, or use the /TP switch to compile everything as C++.
Problems with Linking
If you're linking to a library written in C, the above fix can cause linking to fail. This is because, now that you're compiling your C files as C++, the function names will be mangled. When the compiler adds the library and tries to match the name of the function you called to one exported by the library, it will fail because the name exported by the library will not be mangled.
To combat this problem, C++ allows you to specify that specific names are exported with "C" linkage, which tells the compiler that the names are not mangled. This is usually done by prefixing the function declaration with extern "C", or placing everything in a block of
extern "C" {
/* header contents here */
}
Well-disciplined C library developers know about this problem and will use techniques, such as macros, to combat it. A common technique is to detect when the user is compiling as C++, and place macros similar to these at the beginning and end of a block of declarations in a header file:
#if defined (__cplusplus)
#define BEGIN_EXTERN_C extern "C" {
#define END_EXTERN_C }
#else
#define BEGIN_EXTERN_C
#define END_EXTERN_C
#endif
If you're using well-established, well-coded C libraries, the headers probably already contain something similar. If not, you might need to do it yourself (and if the library is open source, submit the change as a patch!)
The future of C in Visual Studio
There is an MSDN blog post from July 2013, which announced that a large number of C99 features have been implemented for Visual Studio 2013. Part of the reason for this seems to be that the features are mentioned in parts of some C++ standards, so they would be required anyway. The new features include new math.h functions, new inttypes.h types and more. See the post for a full list.
An earlier post gives the following tidbits:
Additionally, some C99 Core Language features will be implemented in 2013 RTM:
C99 _Bool
C99 compound literals
C99 designated initializers
C99 variable declarations
Note that there are features missing, including:
The tgmath.h header is missing. C compiler support is needed for this header.
Note that the ctgmath header was added. This is possible because that header does not require the tgmath.h header, only the ccomplex and cmath headers.
The uchar.h header is missing. This is from the C Unicode TR.
Several format specifiers in the printf family are not yet supported.
The snprintf and snwprintf functions are missing from stdio.h and wchar.h.
Although you can expect them in the future:
We don't hate snprintf() (quite the contrary), we just missed it and ran out of time.
Note that other language features that don't have to do with the standard library are still not available.
It looks like standard C will receive more support in the future, although probably just because the implementation of more modern features is necessary to support C++11 and C++14.
<tgmath.h> and its associated compiler magic are special and I don't know our plans for them (as Pat's post explained, C++ has overloading/templates and doesn't need C compiler magic).

Is there any good reason not to use 'define' with variable argument length?

Recently I came upon this code:
#define LOG(type, str) printf(str)
#define LOG1(type, str,arg1) printf(str,arg1)
#define LOG2(type, str,arg1,arg2) printf(str,arg1,arg2)
#define LOG3(type, str,arg1,arg2,arg3) printf(str,arg1,arg2,arg3)
#define LOG4(type, str,arg1,arg2,arg3,arg4) printf(str,arg1,arg2,arg3,arg4)
The code was written recently, so I guess it can be compiled as C99.
My question is: why not use a single macro with a variable argument count? We would limit ourselves to the LOG macro and nothing more, not to mention that we won't have to add LOG5, LOG6, etc. Would something terrible happen (garbled stack traces, running out of memory, anything) that makes the LOGn solution useful?
I am a minimalist: the fewer lines, the better. But am I missing something here? Was this intentional, or is it bad coding practice?
Lack of preprocessor support for variadic macros looks to me like the only good reason. For decades our project had LOG, LOG1, etc., but we upgraded to ... (variadic macros) recently.
As per wikipedia:
Several compilers support variable-argument macros when compiling C and C++ code: the GNU Compiler Collection 3.0,[2] Visual Studio 2005,[3] C++Builder 2006, and Oracle Solaris Studio (formerly Sun Studio) Forte Developer 6 update 2 (C++ version 5.3).[5] GCC also supports such macros when compiling Objective-C.
If you know your platform, then use fancy variadic macros. If there are several platforms to support, things could turn out to be more interesting.

How to disable interface keyword on visual C++ Express 2008?

I am compiling some legacy C code here, and there are a lot of variables and struct members named "interface", but VC2008 Express complains about these. Do you know how to disable this?
I already changed the settings to compile the code only as C, but that had no effect.
The problem is that MS #defines interface to struct, so that
interface Name {...}
can be used in COM C++ code.
(objbase.h:199: #define interface __STRUCT__)
Just #undef interface after including Windows.h.
Do a
#define interface QQInterface
before your code (e.g. in a header file). This way, everywhere interface is used, the compiler sees "QQInterface", which is not a keyword. If all code includes this define, you will not get compiler or linker errors.
If you are trying to compile reasonably portable C code, it might be worth disabling the Microsoft language extensions (/Za on the command line, Configuration Properties > C/C++ > Language in VS) and see if the code compiles then.
"interface" should not be a keyword in C nor in ISO C++. It is a keyword in the Managed Extensions for C++, so, I guess, somewhere in your configuration you are still telling it to create code for .NET. Make sure everything is set to "Native Code".
However, it's quite possible that you CANNOT set it to Native Code in the Express edition. That's just a guess, but it's reasonable considering MS's positioning of the Express/Standard/Pro editions.
UPDATE: Disregard that last paragraph. MSFT insists that you can create native Win32 apps with VisualC++ Express: http://www.microsoft.com/express/vc/
I faced a similar problem while compiling C++ code which included a dbus header file. Since dbus has several functions that use "interface" as a parameter name, which collides with the interface define, I got the following error: error: expected ',' or '...' before 'struct'.
When I tried this:
#ifdef interface
#undef interface
#endif
it solved the issue.
Not sure if using the dbus C++ binding would have been better. Anyway, I was not using dbus directly, I just had a remote dependency on one of the dbus headers, and this solution worked fine!
You can define WIN32_LEAN_AND_MEAN to avoid inclusion of this definition. See MSDN Docs
