My app uses only functions supported in OpenSSL 0.9.8 and later, but I compile it on a system with a 1.0.0 library installed (with -lcrypto), and the app requires libcrypto.so.1.0.0 or later at install time.
OpenSSL apparently compiles the entire version level "libcrypto.so.1.0.0" into the SONAME of the library, so my app won't run unless this specific version of the library exists. I get that it won't run on a system with only 0.9.8 installed, but what if 1.0.1 is installed?
For every other shared library I use (-lpthreads, -lncurses, ..), ldd shows the SONAME as "libxxx.so.N", so I only need version N installed. OpenSSL is the only library I'm aware of that depends on a very specific version level (V.R.M), so I worry that the app won't run if the installed library is later than libcrypto.so.1.0.0 (or the library is updated to a more recent level).
Is there a way to compile my app to use "libcrypto.so" or "libcrypto.so.1" regardless of which version it's linked against? And why does OpenSSL use the full version in the SONAME when no other library I'm aware of does this?
No, there is no reliable way because in OpenSSL the internal structures keep changing between versions, and quite a few functions are actually macros which access/manipulate the structure members directly.
Still, if you are SURE your application does not use any such macros, and you are willing to take the risk that the structures change and your app stops working, you can dlopen() libcrypto.so and dlsym() the functions you use. There would be around 20 of them. Keep in mind that many things you may be calling, such as SSL_CTX_set_options and SSL_want_read/SSL_want_write, are actually macros that expand to calls to the same underlying functions (SSL_CTX_ctrl and SSL_want, respectively), so those are the symbols you would need to resolve.
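If you do go the dlopen()/dlsym() route, the skeleton looks roughly like the sketch below. This is only an illustration: SHA256() and ERR_get_error() stand in for whatever functions your own code calls, the list of library names to try is up to you, and the program has to be built with -ldl.

    /* Sketch: resolve libcrypto entry points at run time instead of linking
     * against a specific SONAME. Build with: cc app.c -ldl */
    #include <dlfcn.h>
    #include <stddef.h>
    #include <stdio.h>

    typedef unsigned char *(*SHA256_fn)(const unsigned char *, size_t, unsigned char *);
    typedef unsigned long (*ERR_get_error_fn)(void);

    int main(void)
    {
        /* Try the unversioned name first, then the versioned names you are
         * prepared to support. */
        const char *names[] = { "libcrypto.so", "libcrypto.so.1.0.0", "libcrypto.so.0.9.8", NULL };
        void *h = NULL;
        for (int i = 0; names[i] != NULL && h == NULL; i++)
            h = dlopen(names[i], RTLD_NOW | RTLD_GLOBAL);
        if (h == NULL) {
            fprintf(stderr, "no usable libcrypto: %s\n", dlerror());
            return 1;
        }

        SHA256_fn sha256 = (SHA256_fn)dlsym(h, "SHA256");
        ERR_get_error_fn err_get_error = (ERR_get_error_fn)dlsym(h, "ERR_get_error");
        if (sha256 == NULL || err_get_error == NULL) {
            fprintf(stderr, "missing symbol: %s\n", dlerror());
            return 1;
        }

        unsigned char md[32];
        sha256((const unsigned char *)"hello", 5, md);
        printf("first digest byte: %02x\n", md[0]);
        return 0;
    }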
Another option would be to link statically with libcrypto.a and libssl.a. This would also make your app run on systems which do not have OpenSSL installed at all (although those are few). Expect this to add 300-900 KB to the size of your app, though.
I developed a C program requiring some dynamic libraries, most notably libmysqlclient.so, which I intend to run on some remote hosts. It seems like I have the following options for distribution:
Compile the program statically.
Install the required dependencies on the remote host
Distribute the dependencies with the program.
The first option is problematic because I need a matching glibc version at runtime anyway (since I use glibc and libnss for now).
I'm not sure about the second option: is there a mechanism which checks whether an installed library version is sufficient for a program to run (besides the libxyz.so.VERSION naming)? Can I somehow check ABI compatibility at startup?
Regarding the last option: would I distribute ALL shared libraries with the binary, or just the ones which are presumably not installed (e.g. libmysqlclient, but not libm)?
Apart from this, am I likely to encounter ABI compatibility problems if I use a different compiler for the binary than the one the dependencies were built with (e.g. binary built with clang, libraries built with gcc)?
Version checking is distribution-specific. Usually, you would package your application in a .deb or .rpm file using the target distribution's packaging tools, and ship that to users. This means that you have to build your application once for each supported distribution, but there really is no way around that anyway because different distributions have slightly different versions of libmysqlclient. These distribution build tools generate some dependency version information automatically, and in other cases, some manual help is needed.
As a starting point, it's a good idea to look at the distribution packaging for something that relies on the MySQL/MariaDB client library and copy that. Maybe inspircd in Debian is a good example.
You can reduce the amount of builds you need to create and test somewhat by building on the oldest distribution versions you want to support. But some caveats apply; distributions vary in the degree of backwards compatibility they provide.
Distributing dependencies with the program is very problematic because popular libraries such as libmysqlclient are also provided by the base operating system, and if you use LD_LIBRARY_PATH to inject your own version, this could unintentionally extend to other programs as well (e.g., those you launch from your own program). The latter risk is still present even if you use DT_RUNPATH (via the -rpath linker option), although it is somewhat reduced.
A different option is to link just application-specific support libraries statically, and link base operating system libraries dynamically. (This is what some software collections do.) This does not seem to be such a great choice for libmysqlclient, though, because there might be an expectation that its feature set is identical to the distribution (regarding the TLS library and available configuration options), and with static linking, this is difficult to achieve.
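On the narrower question of checking something at startup: there is no way to prove full ABI compatibility at run time, but you can at least fail early and loudly by probing the library with dlopen()/dlsym() before relying on it. A rough sketch (the SONAME and the symbols probed are just examples; adjust them to your target distributions):

    /* Startup probe: check that the MySQL client library loads and that the
     * entry points we need resolve. This catches missing libraries and
     * missing/renamed symbols early; it does not prove ABI compatibility.
     * Build with: cc check.c -ldl */
    #include <dlfcn.h>
    #include <stdio.h>
    #include <stdlib.h>

    static void require_symbol(void *handle, const char *name)
    {
        if (dlsym(handle, name) == NULL) {
            fprintf(stderr, "dependency check failed: %s (%s)\n", name, dlerror());
            exit(1);
        }
    }

    int main(void)
    {
        /* RTLD_NOW forces immediate resolution of the library's own undefined
         * symbols, so a broken installation fails here rather than later. */
        void *h = dlopen("libmysqlclient.so.21", RTLD_NOW);
        if (h == NULL) {
            fprintf(stderr, "dependency check failed: %s\n", dlerror());
            return 1;
        }
        require_symbol(h, "mysql_init");
        require_symbol(h, "mysql_real_connect");
        dlclose(h);

        puts("dependency check passed");
        return 0;
    }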
I am using a shared C library on Linux that is distributed in binary form. The problem is that the dependencies are set to require exactly the versions available on the development machine. For example, each release requires the (at the time) latest glibc and only the exact version of libreadline on their system.
I have contacted the developers and they don't know what to do about this. As far as I can tell, they are not consciously using the latest features, so the library should continue to work with older dependencies. I think they are using gcc on Linux, but they are also using a complex make system to control other compilers to build for Windows and Unix.
How and to what extent can you manage the build process so that a library requires dependencies just of a sufficient version and will also accept later versions?
This was a related question.
Edit: To be clear, I want to know how to build programs so they will accept dependencies with a specific version number or later numbers. Whether the developers compile it or I do, I want to be able to distribute a binary that does not require exactly the versions of dependencies present in the build environment.
Edit 2: After rephrasing the question, I realized this has been covered many times before. Some of the best Q&A (the core technique they describe is sketched just after the list):
Deploying Yesod to Heroku, can't build statically
Compile with older libc
Linking against an old version of libc
How can I link to a specific glibc version?
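The recurring trick in those answers, for glibc specifically, is the GNU assembler's .symver directive: it pins a reference to an explicitly chosen (older) symbol version, so the resulting binary also runs on systems whose glibc only provides that older version. A minimal x86-64 sketch (GLIBC_2.2.5 is the x86-64 baseline tag; check what your platform exports with objdump -T libc.so.6):

    /* Bind memcpy to the old GLIBC_2.2.5 version instead of the newest one the
     * build machine provides (e.g. memcpy@GLIBC_2.14). The directive must
     * appear in every translation unit that references the symbol. */
    #include <stdio.h>
    #include <string.h>

    __asm__(".symver memcpy, memcpy@GLIBC_2.2.5");

    int main(void)
    {
        char dst[6];
        memcpy(dst, "hello", 6);
        puts(dst);
        return 0;
    }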
It's not very confidence-inspiring. They should be building on a stable baseline release; it could just be a virtual install. Some Linux distributions copy a clean build environment for each build so packages aren't linked against updated library versions.
The openSUSE Build Service lets developers build binary packages for a wide variety of distributions: http://openbuildservice.org/about/
IIRC readline is a GPL library, and checking at http://cnswww.cns.cwru.edu/php/chet/readline/rltop.html#Availability suggests it is GPL v3, so if they are using libreadline functions they may be in violation of the GPL and should provide you with the source to their library. I am not sure whether you mean rpm/apt package dependencies, or whether their library is actually calling libreadline.
You can always extract files from rpm or apt packages if necessary, avoiding package-manager issues caused by poor packaging.
I took over a fairly large C codebase. There are lots of legacy binaries that require old versions of shared libraries. The server has newer versions of those exact libraries. I could recompile, or set up symbolic links that point the old version names at the new libraries. Setting up symbolic links will take some time - is there any standard or smart way to do this? I am new to this and would appreciate any tips. This is all C in a FreeBSD environment.
Thanks.
In general when updating legacy code with new libraries, it is best to perform a check by recompiling the source code against the new libraries and their includes. This will allow you to use the compiler to check for inconsistencies between the old and new libraries in areas such as data types, function signatures, etc.
By recompiling you also are able to check that the new libraries provide all of the dependencies that you need.
Finally, doing a recompile will help you check that you are in fact able to recompile and link everything and have all of the necessary components.
I would feel uncomfortable trying to take a shortcut such as using symbolic links.
The shared-library version number is only supposed to be changed when the ABI changes. (Old versions of FreeBSD didn't quite get this right, and it's fixed in more recent versions but only for system libraries!) So the only way to make those applications work properly is to either recompile them, or supply the exact version of the shared library that they were linked against. For programs that only depend on old versions of the FreeBSD system libraries, you can install the compat[45678]x packages, which provide the versions of the libraries supplied with the specified version of the OS -- but there are significant pitfalls:
1) If some of the libraries your application depends on are linked against newer versions of the standard libraries than your application itself is, the dynamic linker will give you two incompatible copies of the standard library, and things are not likely to work.
2) If your application loads external modules or plug-ins using dlopen(), all bets are off, because these modules are not versioned.
FreeBSD 8 and newer use symbol versioning for the C library and some other important system libraries, so those libraries should never change library version again and ABI compatibility will be preserved. Many third-party developers are not so careful, and will both break ABI without changing the library version, and change the library version without breaking the ABI, so you can't win. (Some developers don't read the documentation and think that the shared-library version number should be the same as the product's version number.)
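For the curious, here is roughly what symbol versioning looks like from the library author's side with the GNU toolchain. The library, symbols, and version nodes below are invented for the example; the point is that both the old and the new ABI live in the same .so under the same SONAME:

    /* demo.c -- build with:
     *   cc -fPIC -shared demo.c -Wl,--version-script=libdemo.map -o libdemo.so.1
     * where libdemo.map declares the version nodes, e.g.:
     *   DEMO_1.0 { global: demo_frob; local: *; };
     *   DEMO_2.0 { global: demo_frob; } DEMO_1.0;
     */

    /* Old ABI, kept so binaries linked against DEMO_1.0 keep working. */
    int demo_frob_old(int x)
    {
        return x + 1;
    }
    __asm__(".symver demo_frob_old, demo_frob@DEMO_1.0");

    /* Current ABI: the default version that newly linked programs bind to. */
    int demo_frob_new(int x, int flags)
    {
        (void)flags;   /* new parameter in the new ABI; unused in this toy */
        return x + 1;
    }
    __asm__(".symver demo_frob_new, demo_frob@@DEMO_2.0");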
I'm a bit naive when it comes to application development in C. I've been writing a lot of code for a programming language I'm working on and I want to include stuff from ICU (for internationalization and unicode support).
The problem is, I'm just not sure if there are any conventions for including a third-party library. For something like readline, where lots of systems are probably going to have it installed already, it's safe to just link to it (I think). But what about if I wanted to include a version of the library in my own code? Is this common, or am I thinking about this all wrong?
If your code requires 3rd party libraries, you need to check for them before you build. On Linux, at least with open-source, the canonical way to do this is to use Autotools to write a configure script that looks for both the presence of libraries and how to use them. Thankfully this is pretty automated and there are tons of examples. Basically you write a configure.ac (and/or a Makefile.am) which are the source files for autoconf and automake respectively. They're transformed into configure and Makefile.in, and ./configure conditionally builds the Makefile with any configure-time options you specify.
Note that this is really only for Linux. I guess the canonical way to do it on Windows is with a project file for an IDE...
If it is a .lib (a static library) with no runtime-linked dependencies, it gets compiled into your code. If you need to link against dynamic libraries, you will have to ensure they are there: provide an installer, or point the user to where they can obtain them.
If you are talking about shipping your software off to end users and are worried about dependencies - you have to provide them with correct packages/installers that include the dependencies needed to run your software, or otherwise make sure the user can get them (subject to local laws, export laws, etc., but that's all about licensing).
You could build your software and statically link in ICU and whatever else you use, or you can ship your software and the ICU shared libraries.
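Either way, a tiny sanity check like the one below tells you which ICU your binary actually ended up with (u_getVersion() and u_versionToString() are ICU C API calls; link with -licuuc for the shared case, or against libicuuc.a and its dependencies for the static case):

    /* Print the version of the ICU library this binary is actually using. */
    #include <stdio.h>
    #include <unicode/uversion.h>

    int main(void)
    {
        UVersionInfo version;                       /* uint8_t[4] */
        char str[U_MAX_VERSION_STRING_LENGTH];

        u_getVersion(version);
        u_versionToString(version, str);
        printf("running against ICU %s\n", str);
        return 0;
    }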
It depends on the OS you're targeting. For Linux and Unix system, you will typically see dynamic linking, so the application will use the library that is already installed on the system. If you do this, that means it's up to the user to obtain the library if they don't already have it. Package managers in Linux will do this for you if you package your application in the distro's package format.
On Windows you typically see static linking, which means the application bundles the library and will use that specific version. Many different applications may use the same library but each includes its own copy, so you can have many copies of the library floating around on your system.
The problem with shipping a copy of the library with your code is that you don't get the benefit of the library's maintainers' bug fixes for free. Obscure, small, and unsupported libraries are generally worth linking statically. Otherwise I'd just add the dependency and ensure that whatever packages you ship indicate it appropriately.
What is the utility of devel packages like "libgtk+-devel" or "python-devel"? Do they contain the source of the library? How are they different from non-devel packages like libgtk+?
The *-devel packages (usually called *-dev in Debian-based distributions) usually contain all the files necessary to compile code against a given library.
For running an application that uses the library libfoo, only the actual shared library file (*.so.*, for example libfoo.so.1.0) is needed (plus possibly some data files and some version-specific symlinks).
When you actually want to compile a C application that uses that library, you'll need the header files (*.h, for example foo.h) that describe the library's interface, as well as a version-less symlink to the shared library (*.so, for example libfoo.so -> libfoo.so.1.0). Those are usually bundled in the *-devel package.
Sometimes the *-devel packages also include statically compiled versions of the libraries (*.a, for example libfoo.a) in case you want to build a complete stand-alone application that doesn't depend on dynamic libraries at all.
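As a concrete example, take zlib (the package names differ per distribution, but the pattern is the same): the runtime package provides libz.so.1, which is all an already-built program needs, while the devel package provides <zlib.h> and the libz.so symlink that the small program below needs at compile and link time.

    /* Build with: cc demo.c -lz
     * <zlib.h> and the libz.so symlink come from the zlib devel package;
     * at run time only libz.so.1 from the runtime package is needed. */
    #include <stdio.h>
    #include <zlib.h>

    int main(void)
    {
        /* zlibVersion() returns the version string of the library actually
         * loaded at run time. */
        printf("zlib %s\n", zlibVersion());
        return 0;
    }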
Other languages (such as Java, Python, ...) use a different way of noting the API of a library (effectively including all the necessary information in the actual library) and thus usually need no separate *-devel packages (except maybe for documentation and additional tools).
They usually contain necessary headers and libraries. For example, python-devel will provide the Python headers and libraries that you need if you want to embed the Python interpreter in your own application. Some additional tools and documentation are included, too (e.g. a developer manual or code examples).
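As a small illustration: with python-devel (or python3-devel) installed, the following is enough to embed the interpreter; the build command is approximate, since the exact flags depend on the Python version.

    /* Embed the Python interpreter; <Python.h> and the libpython shared
     * library come from the python-devel package.
     * Build with something like:
     *   cc embed.c $(python3-config --cflags --ldflags --embed) */
    #include <Python.h>

    int main(void)
    {
        Py_Initialize();                                  /* start the interpreter */
        PyRun_SimpleString("print('hello from embedded Python')");
        Py_Finalize();                                    /* shut it down */
        return 0;
    }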