Compiling C code into a .bin file in Visual Studio 2012 Express

I was wondering if there was a way to set the compiler to compile my code into a .bin file that contains only the 1's and 0's, rather than hex code as in a .exe file. I want the code to run on the processor, not the operating system. Are there any settings for that in the Express edition? Thanks in advance.

There is nothing magic about a ".bin" file. The extension generally just indicates a binary file, but all files are binary. So you can create a ".bin" file by renaming the ".exe" file that your linker generates to ".bin".
I presume you won't be satisfied with that, so I'll elaborate a little further. The ".exe" file extension (at least on Windows, which I'll assume since you've added a Visual Studio-related tag) implies a binary file with a special format—a Portable Executable, or PE for short. This is the standard form of binary file used on Windows operating systems, both for executables and DLLs.
So a PE file is a binary (".bin") file, but an unknown binary file with a ".bin" extension is not necessarily a PE file. You could have taken some other binary file (like an image) and renamed it to have a ".bin" extension. It just contains a sequence of binary bits in no particular format. You won't be able to execute the file because it's not in the correct, recognized format. It's lacking the magic PE header that makes it executable. There's a reason that C build systems output PE files by default: that's the only type of file that's going to be of any use to you.
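For illustration, here is a minimal sketch in C of the check a loader does to recognize that header: a PE file starts with the DOS "MZ" magic, and the 4-byte offset stored at 0x3C points at the "PE\0\0" signature (the function name and the bare-bones error handling here are just illustrative):

#include <stdio.h>
#include <stdint.h>

/* Returns 1 if 'path' looks like a PE file, 0 otherwise. */
static int is_pe_file(const char *path)
{
    unsigned char mz[2], off[4], sig[4];
    FILE *f = fopen(path, "rb");
    if (!f)
        return 0;

    /* DOS header: the first two bytes must be "MZ". */
    int ok = fread(mz, 1, 2, f) == 2 && mz[0] == 'M' && mz[1] == 'Z' &&
             fseek(f, 0x3C, SEEK_SET) == 0 &&   /* e_lfanew lives at 0x3C */
             fread(off, 1, 4, f) == 4;
    if (ok) {
        uint32_t e_lfanew = off[0] | ((uint32_t)off[1] << 8) |
                            ((uint32_t)off[2] << 16) | ((uint32_t)off[3] << 24);
        /* The PE signature "PE\0\0" sits at that offset. */
        ok = fseek(f, (long)e_lfanew, SEEK_SET) == 0 &&
             fread(sig, 1, 4, f) == 4 &&
             sig[0] == 'P' && sig[1] == 'E' && sig[2] == 0 && sig[3] == 0;
    }
    fclose(f);
    return ok;
}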
And as user1167662 says in his comment, there is nothing magical about hex code. The code in a binary file can be displayed in either hex or binary notation; it's exactly the same information either way. For example, the first byte of every PE file can be written as 0x4D, 77, or 01001101, and it is the ASCII letter 'M' in each case. Any good text editor (at least, one designed for programmers) can open and display the contents of a file using either representation (or ASCII, or decimal).
I want it to be as low level as possible for optimal performance.
There is nothing "lower level" about it, and you certainly won't get any optimized performance. PE files already contain native machine code that runs directly on your microprocessor. It's not interpreted like managed code would be. It contains a series of instructions in your processor's machine language. PE files just contain an additional header that allows them to be recognized and executed by the operating system. This has no effect on performance.
To build an operating system.
Now, that's a bit different… In particular, it's going to be a lot more difficult than writing a regular Windows application. You have a lot of work ahead of you, because you can't rely on the operating system to do anything to help you out. You'll need to get down-and-dirty with the underlying hardware that you're targeting—a developer's guide/manual for your CPU will be very useful.
And you'll have to get a different build environment. Visual Studio is not going to do you any good if you're not creating a PE file in the recognized format, and neither is link.exe, the Microsoft linker included with it. link.exe doesn't support outputting "flat" binary files (i.e., those with the PE header stripped off), so you're going to need a different linker. The GCC toolchain can do this, and MinGW is a Windows port of it.
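For example, the GNU linker can emit a flat binary directly; a sketch, where boot.o and the 0x7C00 load address are placeholders for your own object file and target address:
$ ld -Ttext 0x7C00 --oformat binary -o boot.bin boot.o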
I also recommend a book on operating system development. It's too much to cover in an answer to a Stack Overflow question. And for learning purposes, I strongly suggest playing with an architecture other than Intel's x86.

Related

How to check if object code is 16/32-bit?

Is there any way to identify whether a .obj or .exe file is 16- or 32-bit?
Basically I want to create a smart linker that will automatically identify which linker the given files need to be passed to.
Preferred language: C (it can be different, if needed)
I am looking for some solution that can read the bytes of an .exe/the code of an .obj file and then determine whether it's 16- or 32-bit. Even an algorithm would do.
Note: I know an object file and an executable are two different entities.
All of this information is encoded in the binary object according to the relevant Application Binary Interface (ABI).
The current Linux ABI is the Executable and Linkable Format (ELF), and you can query a specific binary file using a tool such as readelf or objdump.
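For example, readelf's header summary includes the word size directly (the Class line reads ELF32 or ELF64):
$ readelf -h ./a.out | grep Class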
The current Windows ABI is the Portable Executable (PE) format. I'm not familiar with the toolset here, but a quick Google search suggests there are programs that serve the same function as readelf:
http://www.pe-explorer.com/peexplorer-tour.htm
Here's the Microsoft specification of the PE format:
https://learn.microsoft.com/en-us/windows/win32/debug/pe-format
However, neither of those formats supports 16-bit binaries anymore. The older Linux ABI format is called a.out, which can be read and queried with objdump (I'm not sure about readelf). The older Windows/DOS formats are called MZ and NE. Again, I'm not familiar with the tool support for these older Windows formats.
Wikipedia has a pretty comprehensive list of all the popular executable file formats that have been used, with links to more info:
https://en.wikipedia.org/wiki/Comparison_of_executable_file_formats
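If you'd rather do the detection in your own code, as the question asks, the checks are small enough to write directly. A minimal sketch in C, using the magic values from the ELF and PE/NE specifications: an "MZ" file whose header offset points at "PE\0\0" is a 32/64-bit Windows binary, one pointing at "NE" is 16-bit Windows, and a bare "MZ" is a 16-bit DOS executable:

#include <stdio.h>
#include <stdint.h>

/* Rough format/bitness sniffer; the magic values come from the
 * ELF e_ident and PE/NE specifications. */
static void identify(const char *path)
{
    unsigned char b[64];
    FILE *f = fopen(path, "rb");
    if (!f || fread(b, 1, sizeof b, f) != sizeof b) {
        puts("unreadable or too short");
        if (f) fclose(f);
        return;
    }

    if (b[0] == 0x7F && b[1] == 'E' && b[2] == 'L' && b[3] == 'F') {
        /* e_ident[EI_CLASS]: 1 = 32-bit, 2 = 64-bit */
        puts(b[4] == 1 ? "ELF, 32-bit" :
             b[4] == 2 ? "ELF, 64-bit" : "ELF, unknown class");
    } else if (b[0] == 'M' && b[1] == 'Z') {
        uint32_t hdr = b[0x3C] | ((uint32_t)b[0x3D] << 8) |
                       ((uint32_t)b[0x3E] << 16) | ((uint32_t)b[0x3F] << 24);
        unsigned char sig[4] = {0, 0, 0, 0};
        if (fseek(f, (long)hdr, SEEK_SET) == 0)
            fread(sig, 1, 4, f);
        if (sig[0] == 'P' && sig[1] == 'E' && sig[2] == 0 && sig[3] == 0)
            puts("PE, 32- or 64-bit Windows (see the COFF Machine field)");
        else if (sig[0] == 'N' && sig[1] == 'E')
            puts("NE, 16-bit Windows");
        else
            puts("MZ, 16-bit DOS");
    } else {
        puts("unknown format");
    }
    fclose(f);
}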

How do I compile C code to a raw OS-less binary?

Considering that C is a systems programming language, how can I compile C code into raw x86 machine code that could be invoked without the presence of an operating system? (I.e., you can assume I have a boot sector that loads the raw machine code from disk into memory, then jumps directly to the first instruction.)
And now, for bonus points: Ideally, I'd like to compile using Visual Studio 2010's compiler because I've already got it. Failing that, what's the best way to accomplish the task, without having to install a bunch of dependencies or having to make large sweeping configuration changes across my entire system? I'd be compiling on Windows 7.
Usually, you don't. Instead, you compile your code normally, and then (either with the linker or some other tool) extract a raw binary from the object file.
For example, on Linux, you can use the objcopy tool to copy an object file to a raw binary file.
$ objcopy -O binary object.elf object.binary
First off, you don't use any libraries that require a system call (printf, fopen, read, etc.). Then you compile the C files normally. The major difference is the linker step: if you are used to letting the C compiler call the linker (or letting some GUI do it), you will likely need to take that over manually in some form. The specific solution depends on your tools. You will need some bootstrap code (the small amount of assembly needed to cover the assumptions of C compilers and programmers and to launch the entry point of your C program), and a linker script or the right command-line options for the linker to control the address space of the binary as well as to link the objects together. Then, depending on the output format of the linker, you might have to convert it to some other binary format (Intel hex, S-record, exe, com, COFF, ELF, raw binary, etc.) to be compatible with wherever it is going to be loaded or run.
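A minimal sketch of that bootstrap-plus-linker sequence, assuming a GNU toolchain; the file names, entry symbol, and load address are placeholders, and a real bootstrap would normally be a few lines of assembly that also sets up the stack and zeroes .bss:

/* start.c -- illustrative bootstrap glue; kernel_main is a
 * hypothetical entry point in your own C code. */
void kernel_main(void);

void _start(void)   /* the symbol the linker treats as the entry point */
{
    kernel_main();
    for (;;) { }    /* there is no OS to return to */
}

$ gcc -ffreestanding -c start.c kernel.c
$ ld -Ttext 0x100000 --oformat binary -e _start -o kernel.bin start.o kernel.o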

C object file compatibility between computers

First I want to state for the record that this question is related to school/homework.
Let’s say computers CP1 and CP2 both share the same operating system and machine language. If a C program is compiled on CP1, in order to move it to CP2, is it necessary to transfer the source code and recompile on CP2, or can we simply transfer the object files?
My gut answer is that the object files should suffice. The C code is translated into assembly by the compiler and assembled into machine code by the assembler. Because the architecture shares the same machine code and operating system, I don't see a problem.
But the more I think about it, the more confused I’m starting to get.
My questions are:
a) Since it's referring to object files and not executables, I'm assuming there has been no linking. Would any problems surface when linking on CP2?
b) Would it matter if the code used the C11 standard on CP1 but the only compiler on CP2 was C99? I'm assuming this is irrelevant once the code has been compiled/assembled.
c) The question doesn't specify shared/dynamically linked libraries. So this would only really work if the program had no dependencies on .dll/.so/.dylib files, or else these would be required on CP2 as well.
I feel like there are so many gotchas, and considering how vague the question is I now feel that it would be safer to simply recompile.
Halp!
The answer is: it depends. When you compile a C program and move the object files to link on a different computer, it should work. But because of factors such as endianness or name mangling, your program might not work as intended, and might even crash when you try to run it.
C11 is not supported by a C99 compiler, but that does not matter once the source has been compiled and assembled.
As long as the source is compiled against the libraries on one machine, you don't need the libraries to link or run the file(s) on the other computer (static libraries only; dynamic libraries will have to be present on the computer you run the application on). That said, you should make the program independent of dynamic libraries so you don't run into the same problems as before, where the program doesn't work as intended or crashes.
You could get a compiler that supports EABI so you don't run into these problems. Compilers that support the EABI create object code that is compatible with code generated by other such compilers, thus allowing developers to link libraries generated with one compiler with object code generated with a different compiler.
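Concretely, the scenario in the question is just the usual separate compile and link steps split across two machines (GCC shown; the Microsoft toolchain is analogous):
$ gcc -c program.c -o program.o   # on CP1: compile to an object file
$ gcc program.o -o program        # on CP2 (same OS and CPU): link into an executable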
I have tried to do this before, but not a whole lot, and not recently. Therefore, my information may not be 100% accurate.
a) I've already heard the term "object files" used to refer to linked binaries, even though that's somewhat inaccurate. So maybe they mean "binaries". I'd say linking on a different machine could be problematic if it has a different compiler, unless object file formats are standardized, which I'm not sure about.
b) Using different standards or even compilers doesn't matter for binary code, provided it's linked statically. If it relies on functions from a dynamic lib, there could be problems. Which answers c) as well: yes, this will be a problem. The program won't start if it doesn't have all the required dynamic libs in the correct versions. It depends on the linking mode (static vs. dynamic), again.
Q: Let’s say computers CP1 and CP2 both share the same operating system and machine language.
A: Then you can run the same .exe's on both computers.
Q: If a C program is compiled on CP1, in order to move it to CP2, is it necessary to transfer the source code
A: No. You only need the source code if you want to recompile. You only need to recompile if it's a different, incompatible CPU and/or OS.
"Object files" are generally not needed at all for program execution:
http://en.wikipedia.org/wiki/Object_files
An object file is a file containing relocatable format machine code
that is usually not directly executable. Object files are produced by
an assembler, compiler, or other language translator, and used as
input to the linker.
An "executable program" might need one or more "shared libraries" (aka .dll's). In which case the same restrictions apply: the shared libraries, if not already resident, must be copied along with the .exe, and must also be compatible with the CPU and OS.
Finally, "scripts" do not need to be recompiled. You may copy the script freely from computer to computer. But each computer must have an "interpreter" to run the script: a Perl script needs a Perl interpreter, a Python script a python interpreter, and so on.

Any way to decompile binary resource file built with ancient compiler?

I'm trying to resurrect a 1990's application that was built with Borland Turbo C++ (version unknown, maybe 3.0, maybe 4.5?), and apparently targeted for Windows 3.1.
The project contains a single .c file, and a single .res file. Rather than try to locate the ancient compiler, I've tweaked the C source into compatibility with MinGW gcc ver 4.5.2, thinking I could rebuild it for win32. Unfortunately, this is one of those windows programs where the main window is a dialog box, and the dialog specifications are embedded in the .res file. Of course modern MinGW gcc doesn't understand the old .res format.
So is there a way to recover an .rc file from a 1990's vintage Borland .res file? I know there will be other problems compiling old 16 bit windows code like this, but I can deal with that later (it's only 2K loc), right now the stumbling block is this resource file.
somewhat later ..
I have found 'Turbo C 3.1', but this thing is a trip. It can actually compile for 16-bit Windows, the resulting executables requiring an NTVDM to run under XP, but the concept is proved. Tried it on a simple Windows hello-world, and it worked.
Anyway, the problem is still the .res file! There was a project (.prj) file with the aforementioned material, but it apparently calls out the .rc source file. I know with gcc, I can link an already compiled resource file into an executable, but heck if I can figure out the strange command line for 'bcc' to do it. To get an idea how odd it is, bcc uses -W as a flag to 'create windows application'. It must be possible. Anybody remember?
(FWIW, I think there may be better tags for this; feel free to re-tag.)
Open Watcom C/C++'s Resource Editor (wre.exe) seems to be able to open 16-bit .res files. If the latest version doesn't work fully (which isn't totally unexpected as very few people work with 16-bit resource files), try earlier ones.
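For the MinGW port of the program, once an .rc file has been recovered, the usual route is to compile it with windres and link the result like any other object file (file names here are placeholders):
$ windres app.rc -O coff -o app_res.o
$ gcc main.o app_res.o -o app.exe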

Platform-dependent issue in run-length encoding of bmp files using C

I've written a program which opens a .bmp file, treats it as a character file, and performs run-length encoding on it. It produces a valid compressed encoding file, which I read back in to perform the decoding.
When I made the application I was using Fedora and it ran perfectly fine. Now I'm running it on Ubuntu and it refuses to work.
Any idea what is wrong? I fear it has to do with the encoding.
I would first and foremost suggest using a source code debugger to find the problem.
Possible causes include using different compilers on the different systems, which might do different things with, for example, packing structs (e.g., BITMAPFILEHEADER). You also might have different CPU architectures on the two systems (64- vs. 32-bit).
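For example, the on-disk BITMAPFILEHEADER is 14 bytes, but without an explicit packing directive most compilers pad the equivalent struct to 16 bytes, so an fread() into it shifts every field. A sketch of the usual fix (the pragma is understood by both GCC and MSVC):

#include <stdint.h>

#pragma pack(push, 1)            /* match the on-disk layout exactly */
typedef struct {
    uint16_t bfType;             /* "BM" */
    uint32_t bfSize;
    uint16_t bfReserved1;
    uint16_t bfReserved2;
    uint32_t bfOffBits;          /* offset of the pixel data */
} BitmapFileHeader;              /* 14 bytes packed, typically 16 unpacked */
#pragma pack(pop)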
You can also use a hex editor (e.g., XVI32) to examine the differences between BMP files generated by the two versions of your program.
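For reference, the encoding step itself can be written so that it has no platform-dependent behavior at all: open the files in binary mode and work on plain bytes. A minimal byte-oriented run-length encoder (illustrative; the question's actual on-disk format isn't shown):

#include <stdio.h>

/* Emits each run as a (count, byte) pair, with count capped at 255.
 * 'in' and 'out' must be opened with "rb" and "wb" respectively. */
static int rle_encode(FILE *in, FILE *out)
{
    int c = fgetc(in);
    while (c != EOF) {
        int run = 1, next;
        while ((next = fgetc(in)) == c && run < 255)
            run++;
        if (fputc(run, out) == EOF || fputc(c, out) == EOF)
            return -1;
        c = next;   /* the byte that ended the run starts the next one */
    }
    return 0;
}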
