Force LP64 data model with GCC or Clang on Windows

Is there a way to force the GCC and/or Clang compilers to use the LP64 data model when targeting Windows (ignoring that Windows uses the LLP64 data model)?

No, because the requested capability would not work
You are "targeting Windows", presumably meaning you want the compiler to produce code that will run under Windows in the usual way. In order to do that, the program must invoke functions in the Windows API. There are effectively three versions of the Windows API: win16, win32, and win64. Since you want 64-bit pointers (the "P64" in "LP64"), the only possible target is win64.
In order to call a win64 function, you must include windows.h. That header file uses long. If there were a compiler switch to insist that long be treated as a 64-bit integer (LP64) rather than 32-bit (LLP64), then the compiler's understanding of how to call functions and lay out data structures that use long would be wrong; the resulting program would not run correctly.
The same problem applies to the standard C and C++ libraries. If you link to an existing compiled library (as is typical), the calls into it won't work (since it will use LLP64). If you were to build one from source using a hypothetical switch to force LP64, its calls into the Windows API would fail.
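To make the layout problem concrete, here is a minimal sketch (a hypothetical structure, not one taken from the Windows headers) showing how the size and field offsets of anything containing a long depend on the data model:

    #include <stdio.h>

    /* Hypothetical structure; the Windows headers declare many types this way,
       e.g. via the LONG typedef, which is a 32-bit long under LLP64. */
    struct example {
        long low;    /* 4 bytes under LLP64, 8 bytes under LP64 */
        long high;   /* 4 bytes under LLP64, 8 bytes under LP64 */
    };

    int main(void)
    {
        /* An LLP64-built DLL expects this structure to be 8 bytes;
           an LP64 caller would hand it a 16-byte layout instead. */
        printf("sizeof(long) = %zu, sizeof(struct example) = %zu\n",
               sizeof(long), sizeof(struct example));
        return 0;
    }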
But you can try Cygwin
Cygwin uses LP64 and produces binaries that run on Windows. That is possible, despite what I wrote above, because the Cygwin DLL acts as a bridge between the Cygwin LP64 environment and the native win64 LLP64 environment. Assuming you have code originally written for win32 that you now want to move to a 64-bit address space with no or minimal code changes, I suspect this is the easiest path. But I should acknowledge that I've never used Cygwin in quite this way, so there might be problems I am not aware of.
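If you go this route, a quick way to confirm which data model a given toolchain uses is to compile and run a trivial probe like the following sketch; under Cygwin's 64-bit GCC it should report an 8-byte long and an 8-byte pointer (LP64), while under MinGW-w64 or MSVC the long stays at 4 bytes (LLP64).

    #include <stdio.h>

    int main(void)
    {
        printf("sizeof(long)  = %zu\n", sizeof(long));
        printf("sizeof(void*) = %zu\n", sizeof(void *));
        return 0;
    }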

Related

Running C Built .exe On Another PC

I've built a programme in C with Visual Studio using standard C libraries + a couple of Windows libraries.
The code just acquires user input (with scanf, so through the cmd window), does some calculations based on the input, and outputs some text files; that's about it.
I'm wondering what I would then need to do to run my exe on another standard Windows computer without it needing to install any additional files, e.g. the whole Windows SDK?
Is it just a case of being able to build the release version (as opposed to the debug)?
Many thanks
If you pick the right CPU target (Project Configuration Properties -> C/C++ -> Enable Enhanced Instruction Set) so that the binary doesn't rely on instructions understood by only a narrow subset of CPUs, your program will run on almost any Windows computer; the Visual Studio default is to use instructions supported by the widest set of x86 or x64 CPUs. You do, however, need to distribute any DLLs that aren't part of the base Windows installation, which includes any additional dynamically linked language runtimes such as the Visual C++ runtime.
A good way to derive the list of DLLs that you have to package with your executable is to create a virtual machine with a fresh Windows installation without any development tools in it and then try to run your code there. If you get an error for a missing DLL, add it and repeat. When it comes to the Visual C++ runtime, Microsoft provides installable packages for the different versions that you are allowed to distribute as part of your installation (Visual C++ Whatever Redistributable Package).
Also, bear in mind that programs compiled for 32-bit Windows will mostly run on 64-bit versions, but not the other way around.
Generally speaking, compiled C programs are not "portable", meaning they can't simply be copied over to other machines and be expected to run.
There are a few exceptional cases where C executables can safely be run on other machines:
The CPUs support the same instruction set, with the same set of compatible side-effects and possibly the same bugs.
The operating systems support the same system API entry points, with the same set of compatible side-effects and possibly the same bugs.
The installed non-operating-system libraries support the same API entry points, with the same set of compatible side-effects and possibly the same bugs.
The API calling conventions are the same between the source (platform you built the code on) and destination (platform you will run the executable on).
Of course, not all of the CPU and OS needs to be 100% compatible, just the parts your compiled program uses (which are not always easy to identify, since the executable is compiled and is not a 100% identical representation of the source code).
For these conditions to hold, typically you are using the same release of the operating system, or a compatibility interface designed by the operating system packagers that supports the current version of the operating system and older versions too.
The details on how this is most easily done differ between operating systems, and generally speaking, even if a compatibility layer is present, you need to do adequate testing as the side-effects and bugs tend to differ despite promises of multi-operating system compatibility.
Finally, there are some environments (like QEMU) that can run executables built for a different CPU by simulating the foreign CPU instruction set at runtime, interpreting those instructions into ones that are compatible with the current CPU. Such systems are not common across non-Linux operating systems, and they may stumble if the dynamic loader can't locate and load foreign-instruction-set libraries.
With all of these caveats, you can see why most people decide to write portable source code and recompile it on each of the target platforms; or, write their code for an interpreter that already exists on multiple platforms. With the interpreter approach, the CPU is conceptually a virtual one, which is implemented to be identical across hardware, letting you write one set of source code to be interpreted across all the hardware platforms.
I've built a programme in C with Visual Studio using standard C libraries + a couple of Windows libraries.
I'm wondering what I would then need to do to run my exe on another standard Windows computer without it needing to install any additional files, e.g. the whole Windows SDK?
You don't explain what your program is really doing. Does it have a graphical user interface? Is it a web server? Do you have some time to improve it or enhance it? Can it run on the command line?
Why can't you share the C source code (e.g. using GitHub)?
If you want some graphical interface, consider using GTK. It has been ported to Windows.
If a web interface is enough, consider using libonion in your program, or find some HTTP server library in C for your particular platform.
But what you need to understand is mostly related to linking, not only to the C programming language (read the n1570 specification). Hence read about Linkers and loaders.
You should prefer static libraries, but their availability is platform-specific. Read more about Operating Systems. Also, sharing some Microsoft Windows system WinAPI libraries could be illegal (even when technically possible), so consult a lawyer after showing them the EULA you are bound by.
My opinion is that Linux distributions are very friendly when learning to program in C (e.g. using GCC or Clang as your C compiler). Both Ubuntu and Debian are freely downloadable, but in practice require on the order of a hundred gigabytes of free disk space. Be sure to back up your important data before installing one of them.
Did you consider porting, adapting and compiling your C code to WebAssembly? If you did that, your code would run inside most recent Web browsers. Look also into Bellard's JSLinux.
Notice that C can also be transpiled to JavaScript and then run locally inside recent web browsers.

Is it possible to compile C for many architectures on Linux?

I'm developing a few programs on my PC, which runs 64-bit Ubuntu.
I'd like to run these applications on another PC that runs a 32-bit system. Is it possible to compile them on my machine, or do I need to recompile the applications on the other one?
In general you need to provide the compiler an environment similar to the target execution environment. Depending on how similar or different one environment is to another, this may be simple or complicated.
Assuming the compiler is GCC, you should only need to add -m32 to your compilation flags to produce binaries that work on a 32-bit system, all other things being equal. Ensure you have the necessary 32-bit dependencies installed on your system (this means the base C library as well as a 32-bit version of each library your application links against).
Since you are only compiling for 32-bit x86 on a 64-bit x86 host, the path to this is generally simple; a quick sanity check is sketched below. I would recommend, however, setting up a dedicated environment in which to compile, typically some kind of chroot (see pbuilder, schroot, chroot, debootstrap and others).
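As a sketch of that sanity check (the exact package name for 32-bit support varies by distribution; gcc-multilib is assumed here for a Debian/Ubuntu-like host):

    /* hello32.c -- trivial program used to confirm a 32-bit build.
       Assumed commands:

           sudo apt-get install gcc-multilib     # 32-bit support for GCC
           gcc -m32 hello32.c -o hello32
           file hello32                          # should report "ELF 32-bit"
    */
    #include <stdio.h>

    int main(void)
    {
        printf("sizeof(void *) = %zu\n", sizeof(void *));  /* 4 on a 32-bit build */
        return 0;
    }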
There are compiler settings/flags that should allow you to do this on your machine; which specific ones you need would depend on the compiler you are using.

Is a Linux executable "compatible" with OS X?

If you compile a program in say, C, on a Linux based platform, then port it to use the MacOS libraries, will it work?
Is the core machine-code that comes from a compiler compatible on both Mac and Linux?
The reason I ask this is because both are "UNIX based" so I would think this is true, but I'm not really sure.
No, Linux and Mac OS X binaries are not cross-compatible.
For one thing, Linux executables use a format called ELF.
Mac OS X executables use Mach-O format.
Thus, even though many of the same libraries are ordinarily available and compiled separately on each system, the resulting binaries are not portable between them.
Furthermore, Linux is not actually UNIX-based. It does share a number of common features and tools with UNIX, but a lot of that has to do with computing standards like POSIX.
All this said, people can and do create pretty cool ways to deal with the problem of cross-compatibility.
EDIT:
Finally, to address your point on byte-code: when making a binary, compilers usually generate machine code that is specific to the platform you're developing on. (This isn't always the case, but it usually is.)
In general you can easily port a program across various Unix brands. However you need (at least) to recompile it on each platform.
Executables (binaries) are not usable on several platforms, because an executable is tightly coupled with the operating system's ABI (Application Binary Interface), i.e. the conventions of how an application communicates with the operating system.
For instance if your program prints a string onto the console using the POSIX write call, the ABI specifies:
How a system call is made (32-bit x86 Linux used the 0x80 software interrupt, later the sysenter instruction; x86-64 Linux uses the dedicated syscall instruction)
The system call number
How the function's arguments are passed to the system
Any alignment constraints
...
And this varies a lot across operating systems.
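To make this concrete, here is a sketch (x86-64 Linux only, using GCC inline assembly) of what the portable write() call boils down to once the ABI details above are fixed: the syscall number, which registers carry the arguments, and which instruction enters the kernel.

    /* Sketch: invoking write(2) directly on x86-64 Linux, bypassing libc,
       to show what the ABI pins down.  Not portable to any other OS/CPU. */
    long raw_write(int fd, const void *buf, unsigned long len)
    {
        long ret;
        __asm__ volatile (
            "syscall"                    /* x86-64 Linux system call instruction */
            : "=a" (ret)                 /* return value comes back in rax       */
            : "a" (1L),                  /* syscall number 1 = write on x86-64   */
              "D" ((long) fd),           /* 1st argument in rdi                  */
              "S" (buf),                 /* 2nd argument in rsi                  */
              "d" (len)                  /* 3rd argument in rdx                  */
            : "rcx", "r11", "memory");   /* clobbered by the syscall instruction */
        return ret;
    }

    int main(void)
    {
        raw_write(1, "hello\n", 6);      /* same effect as write(1, "hello\n", 6) */
        return 0;
    }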
Note however that in some cases there may be "ABI adapters" allowing you to run binaries from one OS on another OS. For instance, Wine allows you to run Windows executables on various Unix flavors, and NDISwrapper allows you to use Windows network drivers on Linux.
"bytecode" usually refers to code executed by a virtual machine (e.g. for java or python). C is compiled to machine code, which the CPU can execute directly. Machine language is hardware-specific so it it would be the same under any OS running on an intel chip (even under Windows), but the details of how the machine code is wrapped into an executable file, and how it is integrated with system calls and dynamically linked libraries are different from system to system.
So no, you can't take compiled code and use it in a different OS. (However, there are "cross-compilers" that run on one OS but generate code that will run on another OS).
There is no "core byte-code that comes from a compiler". There is only machine code.
While the same machine instructions may be applicable under several operating systems (as long as they're run on the same hardware), there is much more to a hosted executable than that, and since a compiled and linked native executable for Linux has very different runtime and library requirements from one on BSD or Darwin, you won't be able to run one binary on the other system.
By contrast, Windows binaries can sometimes be executed under Linux, because Linux provides both a binary format loader for Windows's PE format, as well as an extensive API implementation (Wine). In principle this idea can be used on other platforms as well, but I'm not aware of anyone having written this for Linux<->Darwin. If you already have the source code, and it compiles in Linux, then you have a good chance of it also compiling under MacOS (modulo UI components, of course).
Well, maybe... but most probably not.
But if it does, it's not "because both are UNIX"; it's because:
Mac computers happen to use the same processor nowadays (this was very different in the past)
You happen to use a program that has no dependency on any library at all (very unlikely)
You happen to use the same runtime libraries
You happen to use a loader/binary format that is compatible with both.

A simple explanation of what MinGW is

I'm an avid Python user and it seems that I require MinGW to be installed on my Windows machine to compile some libraries. I'm a little confused about MinGW and GCC. Here's my question (from a real dummy point of view):
So Python is a language that is both interpreted and compiled. There are Linux and Windows implementations of Python which one simply installs and then uses the binary to execute one's code. They come bundled with a bunch of built-in libraries that you can use. It's the same with Ruby, from what I've read.
Now, I've done a tiny bit of C and I know that one has to compile it. It has its built-in libraries, which seem to be called header files, which you can use. Now, back in the school days, C was written in a vi-like IDE called Turbo-C, and then you hit F9 to compile it. That's pretty much where my C education ends.
What is MinGW and what is GCC? I've been mainly working on Windows systems and have even recently begun using Cygwin. Aren't they the same?
A simple explanation hitting these areas would be helpful.
(My apologies if this post sounds silly/stupid. I thought I'd ask here. Ignoring these core bits never made anyone a better programmer.)
Thanks everyone.
MinGW is a complete GCC toolchain (including half a dozen frontends, such as C, C++, Ada, Go, and whatnot) for the Windows platform, which compiles for, and links to, the Windows OS component C runtime library in msvcrt.dll.
This means that, unlike Cygwin, MinGW does not attempt to offer a complete POSIX layer on top of Windows; rather, it tries to be minimal (hence the name). On the other hand, it does not require you to link with a special compatibility library.
It therefore also does not have any GPL-license implications for the programs you write (notable exception: profiling libraries, but you will not normally distribute those so that does not matter).
The newer MinGW-w64 comes with a roughly 99% complete Windows API binding (excluding ATL and such) including x64 support and experimental ARM implementations. You may occasionally find some exotic constant undefined, but for what 99% of the people use 99% of the time, it just works perfectly well.
You can also use the bigger part of what's in POSIX, as long as it is implemented in some form under Windows. The one major POSIX thing that does not work with MinGW is fork, simply because there is no such thing under Windows (Cygwin goes through a lot of pain to implement it).
There are a few other minor things, but all in all, most things kind of work anyway.
So, in a very very simplified sentence: MinGW(-w64) is a "no-frills compiler thingie" that lets you write native binary executables for Windows, not only in C and C++, but also other languages.
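As an illustration (a hypothetical hello-world, with an assumed MinGW-w64 cross-compiler name; inside an MSYS2/MinGW shell plain gcc works the same way), this is the kind of native Windows binary MinGW produces, calling the Win32 API directly with no POSIX layer involved:

    /* Assumed build command: x86_64-w64-mingw32-gcc hello.c -o hello.exe
       (add -luser32 if your MinGW setup does not link user32 by default) */
    #include <windows.h>

    int main(void)
    {
        MessageBoxA(NULL, "Built with MinGW", "Hello", MB_OK);
        return 0;
    }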
To compile a C program you need a C implementation for your specific computer.
C implementations consist, basically, of a compiler (with its preprocessor and headers) and a library (the ready-made executable code).
On a computer with Windows installed, the library that contains most ready-made executable code is not compatible with the gcc compiler, so to use this compiler on Windows you need a different library: that's where MinGW comes in. MinGW provides, among other things, the library(ies) needed for making a C implementation together with gcc.
The Windows library and MSVC together make up a different implementation.
MinGW is a suite of development tools that contains GCC (among others), and GCC is a C compiler within that suite.
MinGW is an implementation of most of the GNU build utilities, like gcc and make, on Windows, while gcc is only the compiler. Cygwin is a much bigger and more sophisticated package, which installs a lot more than MinGW.
The only reason for the existence of MinGW is to provide a Linux-like environment for developers not capable of using native Windows tools. It is inferior in almost every respect to Microsoft toolchains on Win32/Win64 platforms, BUT it provides an environment where a Linux developer does not have to learn anything new AND can compile Linux code almost without modifications. It is a questionable approach, but many people find that convenience more important than other aspects of development.
It has nothing to do with C or C++ as was indicated in earlier answers; it has everything to do with the environment the developer wants. The argument about GNU toolchains on Windows and their necessity is just that: an argument.
GCC - Unix/Linux compiler;
MinGW - an approximation of GCC for the Windows environment;
Microsoft and Intel compilers - more of the same, as the names suggest (both produce much, much better programs on Windows than MinGW, by the way).

Cross-compiler binary compatibility in C

I need to verify something about which I have doubts. If a shared library (.dll) is written in C, to the C99 standard, and compiled under one compiler, say MinGW, then in my experience it is binary compatible with, and hence usable from, any other compiler, say MS Visual Studio. I say "in my experience" because I have tried it successfully more than once. But I need to verify whether this is a rule.
And in addition I would like to ask: if it is indeed so, then why don't libraries written completely in C, like OpenCV for example, provide compiled binaries for every different OS? I know that the obvious reason would be the need to set all the compile-time parameters, but other than that there is none, right?
EDIT: I am adding an additional question which I see as a logical extension to the original. Isn't this how one would go about creating a closed-source library? Since the option of giving out the source goes out of the window there, giving binaries is the only choice. And in that case providing binaries for as many architectures as possible is the desired result, with C being an obvious choice for having the best portability between systems and compilers. Right?
In the specific case of C compilers (MSVC and GCC/MinGW) in the Windows world, you are correct in the assumption of binary compatibility. One can link a C interface DLL compiled by GCC to a program in Visual Studio. This is the way C99 projects like FFmpeg allow developers to write applications with Visual Studio. One only needs to create the import library from the DLL with lib.exe, found in the Microsoft toolchain. Or, vice versa, using mingw.org's pexports or, better, mingw-w64's gendef tool, one can create a GCC import library for an MSVC-produced DLL.
This handy interoperability breaks down when you enter the C++ interface world, where the ABIs of MSVC and GCC are different and incompatible. It may work, it may not; no guarantees are made and no effort is (currently) being made to change that. Also, debugging info is obviously different, until someone writes a debug information generator/writer in GCC that is compatible with MSVC's debugger (along with gdb support, of course).
I don't think C99 specifically changes anything to function declarations or the way arguments are handled in symbol definitions, so there should be no problem here either.
Note that, as Vijay said, there is still the architecture difference, so an x86 (32-bit) library can't be linked into an AMD64 (x64) program.
To also answer your additional question about closed-source binaries and distributing a version for all available compilers/architectures:
This is exactly the way you would create a closed-source binary. In addition to the import library, it is also very important to hide exports from the DLL, making the DLL itself useless for linking if you don't want client code to use private functions in the library (see for example the output of dumpbin /exports on an MS Office DLL: lots of hidden stuff there). You can achieve the same thing with GCC (I believe, having never used or tried it) using things like __attribute__((visibility("hidden"))) etc...
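A minimal sketch of that export-hiding idea (hypothetical function names; on Windows/PE the usual mechanism is __declspec(dllexport), while on ELF targets GCC uses symbol visibility, so the macro below just picks whichever applies):

    #if defined(_WIN32)
    #  define API_EXPORT __declspec(dllexport)   /* appears in the export table */
    #  define API_HIDDEN                         /* not exported unless marked  */
    #else
    #  define API_EXPORT __attribute__((visibility("default")))
    #  define API_HIDDEN __attribute__((visibility("hidden")))
    #endif

    API_EXPORT int public_entry(int x);    /* visible to client code       */
    API_HIDDEN int private_helper(int x);  /* kept out of the export table */

    API_EXPORT int public_entry(int x)   { return private_helper(x) + 1; }
    API_HIDDEN int private_helper(int x) { return x * 2; }

On ELF targets, building with -fvisibility=hidden inverts the default so that only explicitly exported symbols remain visible.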
Some compiler-specific points:
MSVC comes with several different runtime libraries, selected through /MT, /MD, and their debug counterparts. On top of this, you would have to provide a build for each version of Visual Studio (including Service Packs) to assure compatibility. But that is closed-source binary distribution on Windows for you...
GCC does not have this problem; MinGW always links to msvcrt.dll provided by Windows (since Windows 98), equivalent to /MD (and maybe also a debug library equivalent to /MDd). But there are two versions of MinGW (mingw.org and mingw-w64) which do not guarantee binary compatibility with each other. The latter is more complete, as it provides 64-bit options as well as 32-bit, and provides a more complete header/library set (including a substantial part of DirectX and the DDK).
The general rule is that IF your OS/CPU combination has a standard ABI, and IF that ABI is powerful enough for your language, most compilers will follow that ABI and as a result will be binary compatible, allowing you to link libraries (shared or static) compiled with different compilers to programs compiled with other compilers just fine.
The problem is that most ABIs are fairly weak -- they're designed around low-level languages like C and FORTRAN and date back to the days before object-oriented languages like C++. So they tend to lack support for things like function overloading, user-defined operators, exceptions, global constructors and destructors, virtual functions, inheritance, and the like that are needed by C++.
This lack was recognized when C++ was designed, which is why C++ has extern "C": it causes the compiler to limit itself to the standard ABI for certain functions, while disabling all the extra C++ features that the ABIs generally don't support.
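A sketch of the usual pattern (hypothetical header and function name): a C header that stays linkable from C++ by forcing the plain C ABI for its declarations.

    /* mylib.h -- hypothetical C library header usable from both C and C++ */
    #ifndef MYLIB_H
    #define MYLIB_H

    #ifdef __cplusplus
    extern "C" {              /* no C++ name mangling, plain C calling rules */
    #endif

    int mylib_add(int a, int b);

    #ifdef __cplusplus
    }
    #endif

    #endif /* MYLIB_H */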
A shared library or dll compiled to a particular architecture can be linked to applications compiled by other compilers that target the same architecture. (By architecture, I mean a processor/OS combination). But it is not practical for a library developer to compile against all possible architectures. Moreover, when a library is distributed in source form, users can build binaries optimized to their specific requirements.
