Can you decompile a C DLL to use P/Invoke on it, or use Reflector?
How do I get the method names and signatures?
Simply put, there is no trivial way to do what you want. You can, however, use a disassembler library such as diStorm to disassemble the code around the exported entry points. There are heuristics you can apply, but many of them only work with the 32-bit calling conventions (__stdcall and __cdecl in particular). Personally I find the Python bindings for diStorm useful, but libdasm can do the same.
Any other tool with disassembler capabilities will be of great value, such as OllyDbg or Immunity Debugger.
Note: if you have a program that already calls the DLL in question, it is usually very worthwhile to run that program under a debugger (of course only if the code can be trusted, but your question basically implies that) and set breakpoints at the exported functions. From that point on you can infer a lot more from the runtime behavior and the stack contents of the running target. This will still be tricky, though, particularly with __cdecl, where a function may take an arbitrary number of parameters. In such a case you'd have to sift through the calling program for xrefs to the respective function and infer from the stack cleanup following the call how many parameters/bytes it discards. Looking at the push instructions before the call will also have some value, though it requires a little experience, especially when calls are nested and you have to discern which push belongs to which call.
Basically you will have to develop a minimal set of heuristics matching your case, unless you have already licensed one of the expensive tools (and know how to wield them) that come with their own heuristics that have usually been fine-tuned for a long time.
If you happen to own an IDA Pro (or Hex-Rays plugin) license already, you should of course use that. Also, the freeware versions of IDA, although they lag behind, can handle 32-bit x86 PE files (which includes DLLs), but the license may be an obstacle here depending on the project you're working on ("no commercial use allowed").
You can use Dependency Walker.
http://www.dependencywalker.com/
You can find the exported function names with dumpbin or Dependency Walker. But to know how to call the functions, you really need a header file and some documentation. If you don't have those, you will have to reverse-engineer the DLL, and that is a very challenging task.
Related
First off, I know that any such library would need at least some interface shimming to interact with system calls or board support packages or whatever (for example, newlib has a limited and well-defined interface to the OS/BSP that doesn't assume intimate access to the OS). I can also see where there will be a need for something similar when interacting with the compiler (e.g. details of some things left as "implementation defined" by the standard).
However, most libraries I've looked into end up being far more wedded to the OS and compiler than that. The OS interactions assume access to whatever parts of a specific OS environment they want, and the compiler interactions are practically in collusion with the compiler implementation itself.
So, the basic questions end up being:
Would it be possible (and practical) to implement a full C standard library that only has a very limited and well defined set of prerequisites and/or interface for being built by any compiler and run on any OS?
Do any such implementations exist? (I'm not asking for a recommendation of one to use, though I would be interested in examples that I can make my own evaluations of.)
If either of the above answers is "no"; then why? What fundamentally makes it impossible or impractical? Or why hasn't anyone bothered?
Background:
What I'm working on that has led me down this rabbit hole is an attempt to make a fully versioned, fully hermetic build chain. The goal is that I should be able to build a project without any dependencies on the local environment (beyond access to the needed source control repo and, say, "a valid POSIX shell" or "language-agnostic-build-tool-of-choice is installed and runs"). Given those dependencies, the build should do the exact same thing, with the same compiler and libraries, regardless of which versions of which compilers and libraries are or are not installed. Universal, repeatable, byte-identical output is the target I want to move toward.
Getting the compiler-of-choice to run from a repo isn't too hard, but I have yet to find a C standard library that "works right out of the box". They all seem to assume some magic undocumented interface between themselves and the compiler (e.g. __gnuc_va_list), or at best want to do some sort of config-probing of the hosting environment, which would be somewhere between useless and counterproductive for my goals.
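To illustrate the kind of coupling I mean, here is roughly what a typical libc's <stdarg.h> boils down to; this is the GCC/Clang spelling, and other compilers use different builtins, so the header simply cannot be written portably:

    /* Sketch of the compiler coupling described above (GCC/Clang spelling;
     * other compilers use different builtins): a libc cannot define
     * va_list by itself, it has to defer to the compiler. */
    typedef __builtin_va_list va_list;
    #define va_start(ap, last) __builtin_va_start(ap, last)
    #define va_arg(ap, type)   __builtin_va_arg(ap, type)
    #define va_end(ap)         __builtin_va_end(ap)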
This looks to be a bottomless rabbit hole that I'm reluctant to start down without first trying to locate alternatives.
I was looking at the NtDll export table on my Windows 10 computer, and I found that it exports standard C runtime functions, like memcpy, sprintf, strlen, etc.
Does that mean that I can call them dynamically at runtime through LoadLibrary and GetProcAddress? Is this guaranteed to be the case for every Windows version?
If so, it is possible to drop the C runtime library altogether (by just using the CRT functions from NtDll), therefore making my program smaller?
There is absolutely no reason to call these undocumented functions exported by NtDll. Windows exports all of the essential C runtime functions as documented wrappers from the standard system libraries, namely Kernel32. If you absolutely cannot link to the C Runtime Library*, then you should be calling these functions. For memory, you have the basic HeapAlloc and HeapFree (or perhaps VirtualAlloc and VirtualFree), ZeroMemory, FillMemory, MoveMemory, CopyMemory, etc. For string manipulation, the important CRT functions are all there, prefixed with an l: lstrlen, lstrcat, lstrcpy, lstrcmp, etc. The odd man out is wsprintf (and its brother wvsprintf), which not only has a different prefix but also doesn't support floating-point values (Windows itself had no floating-point code in the early days when these functions were first exported and documented.) There are a variety of other helper functions, too, that replicate functionality in the CRT, like IsCharLower, CharLower, CharLowerBuff, etc.
Here is an old knowledge base article that documents some of the Win32 Equivalents for C Run-Time Functions. There are likely other relevant Win32 functions that you would probably need if you were re-implementing the functionality of the CRT, but these are the direct, drop-in replacements.
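For illustration, here is a minimal sketch (untested) of what using those drop-in replacements looks like in practice, relying only on documented Kernel32 exports instead of malloc/strlen/free:

    /* Minimal sketch: allocate and measure a string using only documented
     * Kernel32 exports instead of the CRT's malloc/strlen/free. */
    #include <windows.h>

    int main(void)
    {
        HANDLE heap = GetProcessHeap();
        char *buffer = (char *)HeapAlloc(heap, HEAP_ZERO_MEMORY, 64);
        if (buffer == NULL)
            return 1;

        lstrcpyA(buffer, "no CRT needed");
        int length = lstrlenA(buffer);   /* behaves like strlen */

        HeapFree(heap, 0, buffer);
        return length;
    }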
Some of these are absolutely required by the infrastructure of the operating system, and would be called internally by any CRT implementation. This category includes things like HeapAlloc and HeapFree, which are the responsibility of the operating system. A runtime library only wraps those, providing a nice standard-C interface and some other niceties on top of the nitty-gritty OS-level details. Others, like the string manipulation functions, are just exported wrappers around an internal Windows version of the CRT (except that it's a really old version of the CRT, fixed back at some time in history, save for possibly major security holes that have gotten patched over the years). Still others are almost completely superfluous, or seem so, like ZeroMemory and MoveMemory, but are actually exported so that they can be used from environments where there is no C Runtime Library, like classic Visual Basic (VB 6).
It is also interesting to point out that many of the "simple" C Runtime Library functions are implemented by Microsoft's (and other vendors') compilers as intrinsic functions, with special handling. This means that they can be highly optimized. Basically, the relevant object code is emitted inline, directly in your application's binary, avoiding the need for a potentially expensive function call. Allowing the compiler to generate inlined code for something like strlen, which gets called all the time, will almost certainly lead to better performance than paying the cost of a function call to one of the exported Windows APIs. There is no way for the compiler to "inline" lstrlen; it gets called just like any other function. This gets you back to the classic tradeoff between speed and size. Sometimes a smaller binary is faster, but sometimes it's not. Not linking the CRT and calling the exported Windows APIs instead will produce a smaller binary, since it uses function calls rather than inline implementations, but it probably won't produce faster code in the general case.
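To make the tradeoff concrete, here is a small sketch using the MSVC-specific pragmas that control whether strlen is expanded inline or called out-of-line (other compilers use different mechanisms):

    /* MSVC-specific illustration: #pragma intrinsic asks the compiler to
     * expand strlen inline; #pragma function forces an out-of-line call
     * into the CRT instead. */
    #include <string.h>

    #pragma intrinsic(strlen)      /* expand inline where possible */
    /* #pragma function(strlen) */ /* uncomment to force a real call */

    int length_of(const char *s)
    {
        return (int)strlen(s);
    }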
* However, you really should be linking to the C Runtime Library bundled with your compiler, for a variety of reasons, not the least of which is security updates that can be distributed to all versions of the operating system via updated versions of the runtime libraries. You have to have a really good reason not to use the CRT, such as if you are trying to build the world's smallest executable. And not having these functions available will only be the first of your hurdles. The CRT handles a lot of stuff for you that you don't normally even have to think about, like getting the process up and running, setting up a standard C or C++ environment, parsing the command line arguments, running static initializers, implementing constructors and destructors (if you're writing C++), supporting structured exception handling (SEH, which is used for C++ exceptions, too) and so on. I have gotten a simple C app to compile without a dependency on the CRT, but it took quite a bit of fiddling, and I certainly wouldn't recommend it for anything remotely serious. Matthew Wilson wrote an article a long time ago about Avoiding the Visual C++ Runtime Library. It is largely out of date, because it focuses on the Visual C++ 6 development environment, but a lot of the big picture stuff is still relevant. Matt Pietrek wrote an article about this in the Microsoft Journal a long while ago, too. The title was "Under the Hood: Reduce EXE and DLL Size with LIBCTINY.LIB". A copy can still be found on MSDN and, in case that becomes inaccessible during one of Microsoft's reorganizations, on the Wayback Machine. (Hat tip to IInspectable and Gertjan Brouwer for digging up the links!)
If your concern is just the need to distribute the C Runtime Library DLL(s) alongside your application, you can consider statically linking to the CRT. This embeds the code into your executable and eliminates the requirement for the separate DLLs. Again, this bloats your executable, but it does make deployment simpler, without the need for an installer or even a ZIP file. The big caveat, naturally, is that you cannot benefit from incremental security updates to the CRT DLLs; you have to recompile and redistribute the application to get those fixes. For toy apps with no other dependencies, I often choose to statically link; otherwise, dynamic linking is still the recommended approach.
There are some C runtime functions in NtDll. According to Windows Internals these are limited to string manipulation functions. There are other equivalents such as using HeapAlloc instead of malloc, so you may get away with it depending on your requirements.
Although these functions are acknowledged by Microsoft publications and have been used for many years by kernel programmers, they are not part of the official Windows API, and you should not use them for anything other than toy or demo programs, as their presence and behavior may change.
You may want to read a discussion of the option for doing this for the Rust language here.
Does that mean that I can call them dynamically at runtime through LoadLibrary and GetProcAddress?
Yes. Even better, why not use ntdll.lib (or ntdllp.lib) to bind statically to ntdll? Then you can call these functions directly, without any GetProcAddress.
Is this guaranteed to be the case for every Windows version?
From NT4 through Windows 10 many C runtime functions exist in ntdll, but the set differs between versions; usually it grows from version to version. Some of them are less functional than their msvcrt.dll counterparts. For example, printf from ntdll does not support floating-point formats, but in general the functionality is the same.
it is possible to drop the C runtime library altogether (by just using the CRT functions from NtDll), therefore making my program smaller?
Yes, this is 100% possible.
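For example, a minimal sketch of resolving one of these exports at runtime; keep in mind the export set is undocumented and varies by Windows version, so always check for failure:

    /* Minimal sketch: resolve a CRT-style export from ntdll at runtime.
     * ntdll is already mapped into every process, so GetModuleHandle is
     * enough; check for NULL because the export set is undocumented and
     * varies between Windows versions. */
    #include <windows.h>

    typedef size_t (__cdecl *strlen_t)(const char *);

    int main(void)
    {
        HMODULE ntdll = GetModuleHandleW(L"ntdll.dll");
        if (ntdll == NULL)
            return 1;

        strlen_t nt_strlen = (strlen_t)GetProcAddress(ntdll, "strlen");
        if (nt_strlen == NULL)
            return 1;   /* export not present on this Windows version */

        return (int)nt_strlen("hello");   /* 5 */
    }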
I have a large code written in C, but I did not write all of it myself. I wish to create an overview of the call structure in the code for reference. That is: I wish to know what (non-standard) functions are called by the different functions in the code, and thus create a hierarchy or a tree of the different functions. Are there any free, Unix compatible programs (that means no Visual Studio, but a Vim plugin or such would be neat) that can do this, or will I have to write something that can do this myself?
Doxygen does that too; it has to be enabled, though.
For an overview of available tools see
http://en.wikipedia.org/wiki/Call_graph
There is a Vim plugin C Call-Tree Explorer called CCTree
http://www.vim.org/scripts/script.php?script_id=2368
As you mentioned a Vim plug-in, check out http://sites.google.com/site/vimcctree/. It uses CScope to generate the tree, so you will need to first generate a CScope db of your source files.
Have a look at http://www.gson.org/egypt/ This uses GCC to process the code and extracts the interdependencies within the program from the AST it emits.
gprof will do that. It also generates an execution profile, but in doing so it creates a call tree.
I just downloaded SourceTrail (https://github.com/CoatiSoftware/Sourcetrail/releases) and it did what I wanted, which was pretty close to what I think you want.
(What I wanted was to find out what routines called the function I was considering changing, or needed to understand).
Note that it is no longer maintained, but it did exactly what I wanted. It runs under Windows and Linux, and made finding who calls a function pretty trivial (as well as following that function's call tree down as needed). If you care, it has a GUI (is a GUI? whatever).
It does the parsing itself, but it didn't take very long to run, perhaps about the same time or a little less than compiling the code.
But if you want text only, or don't want to use a gui, or don't want to have it scan the code, this isn't for you.
(Notes: in my case, I was hyper-focused on one or two functions and didn't care what system functions were being called. I spent some time stubbing out the include files that were needed, since I ran the parse on one machine (a Linux machine) that didn't have all the include files needed for the Windows program I was looking at, and then did the exploration on a different (Windows) machine. Which, I should mention, worked perfectly. I just copied the entire source tree from my Linux machine to my Windows machine (which included the Sourcetrail project file), loaded Sourcetrail, and had it load the project - done.)
I'm looking for a quick guide to basic DLL hooking in Windows with C, but all the guides I can find are either not C or not Windows.
(The DLL is not part of windows, but a third party program)
I understand the principle, but I don't know how to go about it.
I have pre-existing source code in C++ that shows what I need to hook into, but I don't have any libraries for C, or know how to hook from scratch.
The Detours license terms are quite restrictive.
If you merely want to hook certain functions of a DLL it is often cheaper to use a DLL-placement attack on the application whose DLL you want to hook. In order to do this, provide a DLL with the same set of exports and forward those that you don't care about and intercept the rest. Whether that's C or C++ doesn't really matter. This is often technically feasible even with a large number of exports but has its limitations with exported data and if you don't know or can't discern the calling convention used.
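As a rough sketch of what such a proxy export can look like (the DLL name, function name, and signature here are made up; the exports you don't intercept would be forwarded, e.g. via a .def file):

    /* Hypothetical proxy export. The original DLL is assumed to have been
     * renamed or placed next to the proxy as "real_target.dll". */
    #include <windows.h>

    typedef int (__cdecl *DoWork_t)(int);

    __declspec(dllexport) int __cdecl DoWork(int value)
    {
        static DoWork_t real_DoWork;

        if (real_DoWork == NULL) {
            HMODULE real = LoadLibraryW(L"real_target.dll");
            if (real != NULL)
                real_DoWork = (DoWork_t)GetProcAddress(real, "DoWork");
            if (real_DoWork == NULL)
                return 0;   /* could not resolve the original export */
        }

        /* observe or modify the arguments here, then forward */
        return real_DoWork(value);
    }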
If you must use hooking, there are numerous approaches, including writing a launcher and rewriting the IAT (prepopulated by the loader) to point to your code while the main thread of the launched application is still suspended (see the respective CreateProcess flag). Otherwise you are likely going to need at least a little assembly knowledge to get the jumps correct. There are plenty of liberally licensed disassembler engines out there that will allow you to calculate the proper offsets for patching (because you don't want to patch the middle of a multi-byte opcode, for example).
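A minimal sketch of just the launcher part, assuming a target called target.exe; the actual IAT patching is omitted:

    /* Launcher skeleton only: start the process suspended so its IAT can
     * be patched before any of its code runs, then resume it. */
    #include <windows.h>

    int main(void)
    {
        STARTUPINFOW si = { sizeof(si) };
        PROCESS_INFORMATION pi;

        if (!CreateProcessW(L"target.exe", NULL, NULL, NULL, FALSE,
                            CREATE_SUSPENDED, NULL, NULL, &si, &pi))
            return 1;

        /* ... patch the target's IAT here via ReadProcessMemory /
           WriteProcessMemory before letting the main thread run ... */

        ResumeThread(pi.hThread);
        CloseHandle(pi.hThread);
        CloseHandle(pi.hProcess);
        return 0;
    }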
You may want to edit your question again to include what you wrote in the comments (keyword: "DLL hooking").
Loading DLLs by LoadLibrary()
This is well-known bad practice.
You might want to look up "witch" or "hctiw", the infamous malware dev. There's a reason he's so infamous: he loaded DLLs with LoadLibrary(). Try to refrain from bad practice like that.
I program in Delphi (D7 and D2006) on Windows XP (migrating in the near future to Windows 7). I need to use a mathematical library for some of the work I am doing and most of the math libraries (I am inclining towards Mathematica at present) I have looked at will produce compiled C code. Such code will provide specific functionality to my main programs.
I have a very basic question: given this development setup, how do I start utilising the compiled C code from Delphi? I really need baby steps to get me started on the process.
I've done quite a bit of this with my FE product OrcaFlex. You have two options to link to your C code from Delphi: static or dynamic. I link statically because it makes distribution and versioning much easier. But it's really quite a trick to get it to work statically and you have to rely on a number of undocumented aspects of Delphi.
I suspect that for your needs dynamic linking is best. Basically you need to compile and link your C code into a DLL. I recommend using the Borland C compiler to do this. You can use the free command line version BCC55 to do this. The advantage of using Borland C is that it makes the same assumptions about the 8087 floating point unit as Delphi does. If you build with MSVC then you will find that MS have elected not to raise floating point exceptions. Borland C does raise floating point exceptions. This is a bit of a corner case but it becomes relevant if you are trying to ship a product that you need to be robust.
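For what it's worth, if you do end up building the DLL with MSVC, you can bring its floating-point exception behavior closer to what Delphi expects by unmasking the relevant exceptions yourself. A small sketch using the MSVC CRT's _controlfp; exactly which exceptions to unmask depends on your code:

    /* Sketch (MSVC CRT): unmask the floating-point exceptions that Delphi
     * code typically expects to be raised. */
    #include <float.h>

    void match_delphi_fp_exceptions(void)
    {
        _clearfp();   /* discard any pending FP status first */
        _controlfp(0, _EM_ZERODIVIDE | _EM_OVERFLOW | _EM_INVALID);
    }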
You should know that the C code will, by default, use the C calling convention and I'd just stick with that. You bring it into Delphi by declaring the external routine as cdecl calling convention.
The other thing you need to take care on is defining a clear interface between the two modules. You need to make sure that exceptions don't cross the module boundary and that you don't pass any special types (e.g. Delphi strings) across the boundary. So for a string use a PChar (or even better PAnsiChar or PWideChar to be sure that it won't change meaning when you upgrade to Delphi 2009 and later).
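As a sketch of what the C side of such an interface might look like (the names and routines here are placeholders, not anything from a real library): plain C types, cdecl, and nothing fancy crossing the boundary.

    /* Hypothetical C side of the interface. Build into a DLL, e.g. with
     * bcc32 -WD mymath.c. */
    #ifdef __cplusplus
    extern "C" {
    #endif

    /* simple value in, value out */
    __declspec(dllexport) double __cdecl Evaluate(double x)
    {
        return x * x + 1.0;
    }

    /* strings cross the boundary as plain char pointers (PAnsiChar in Delphi) */
    __declspec(dllexport) int __cdecl LastMessage(char *buffer, int bufferSize)
    {
        const char *msg = "ok";
        int i;
        for (i = 0; msg[i] != '\0' && i < bufferSize - 1; ++i)
            buffer[i] = msg[i];
        if (bufferSize > 0)
            buffer[i] = '\0';
        return i;   /* number of characters written */
    }

    #ifdef __cplusplus
    }
    #endif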
I have been very happy with the SDL Library from Lohninger (http://www.lohninger.com/mathpack.html). It is written in Delphi and compiles right into your application, so there are no bundling or calling convention problems or floating point usage differences, as discussed by other responses in this thread.
Take a look at what he includes. If you're lucky, your needs will be met by his library and you'll be able to use it!
If you currently have Mathematica installed, go to the documentation centre and look up guide/CLanguageInterface; otherwise that guide is available on the web, and it's worth a good read.
My understanding is that Mathematica can generate C programs that link up with the Mathematica engine via MathLink if you need full functionality, or, if you only need lower-level features, it can generate code that can be statically linked with compiled Mathematica libraries, so standalone code is possible.
See the Code Generator documentation.
If you can convert the C programs in to DLLs, then accessing such external functions from Delphi is relatively simple with external declarations.
function MathematicaRoutine(const x: double): double; cdecl; external 'MyInterface.dll';
There are bound to be a great number of complexities in getting this to work if you need to achieve a static bind, for use where Mathematica is not installed, if indeed it is possible. I have never attempted it.
You can mix Delphi and C++ (Builder) code in one project using RAD Studio. Put the automatically created C code into a C++Builder file (.cpp) and add Delphi files for the rest.