Using libxml2 WITHOUT zlib

We are using libxml2 with zlib. There is an option to build libxml2 WITHOUT zlib as well. What is the difference? Is performance affected?

Yes, you can do it.
There is an explicit configure option for excluding zlib.
From the libxml2 FAQ:
What other libraries are needed to compile/install libxml2?
Libxml2 does not require any other library; the normal ANSI C API should be sufficient (please report any violation of this rule you may find).
However, if found at configuration time, libxml2 will detect and use the following libs:
libz: a highly portable and widely available compression library.
iconv: a powerful character encoding conversion library.
The option itself depends on the compilation environment.
Take a look at this article
http://www.tuan.nguoianphu.com/LibXML2_compile_for_Linux_Solaris_Windows
libxml2 uses zlib to read and write gzip-compressed files directly, so if you do not need that functionality the dependency can be removed. Parsing ordinary, uncompressed XML does not go through zlib, so that path is not affected either way.
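If you build libxml2 yourself and want to see what actually changes, the difference only shows up when handling compressed documents. Here is a minimal sketch, assuming the xml2-config script is available and using a placeholder file name data.xml.gz; LIBXML_ZLIB_ENABLED reflects how the installed libxml2 headers were configured:

/* Build e.g. with: cc gz_demo.c $(xml2-config --cflags --libs) */
#include <stdio.h>
#include <libxml/parser.h>
#include <libxml/xmlversion.h>

int main(void)
{
#ifdef LIBXML_ZLIB_ENABLED
    printf("libxml2 headers were configured with zlib support\n");
#else
    printf("libxml2 headers were configured without zlib support\n");
#endif

    /* With zlib support, gzip-compressed XML is opened transparently;
     * without it, this call fails for .gz input.  Ordinary uncompressed
     * files never go through zlib either way. */
    xmlDocPtr doc = xmlReadFile("data.xml.gz", NULL, 0);
    if (doc != NULL)
        xmlFreeDoc(doc);

    xmlCleanupParser();
    return 0;
}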

Related

Using TinyCC (tcc) to generate a C wrapper for V

I am trying to find some basis I can use to generate wrappers/bindings for C libraries to be used from Vlang, and whilst doing so, I remembered that V initially uses TCC for its bootstrap compilation.
Since TCC is a very capable C compiler, I wondered whether it would be possible to use its built-in parser/lexer to generate a symbol table of structs, functions, enums and the like, and then iterate over that table to generate V code.
Judging from reading tcc.h, the API described here is usable, but I wouldn't be surprised if it was declared internal and thus not fully documented. Where can I find more information about how I could use TCC as a plain parser?
I'm sure you've already found some information regarding this, but for posterity, here are some places with information about TCC and using it as a dynamic code generator:
The TCC Git Repo
The GNU project page
The TCC Development Archive
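Beyond those links, the libtcc embedding API (declared in libtcc.h and exercised in tests/libtcc_test.c in the repository) is the closest thing to a documented entry point: it compiles C source in memory and lets you look up the resulting symbols, although it does not expose the parser's internal symbol table. A rough sketch, noting that exact signatures such as tcc_relocate have varied between TCC releases:

/* Build e.g. with: cc tcc_demo.c -ltcc (assumes libtcc is installed) */
#include <stdio.h>
#include <libtcc.h>

int main(void)
{
    TCCState *s = tcc_new();
    if (s == NULL)
        return 1;

    /* Generate code in memory rather than writing an output file. */
    tcc_set_output_type(s, TCC_OUTPUT_MEMORY);

    if (tcc_compile_string(s, "int add(int a, int b) { return a + b; }") < 0)
        return 1;

    /* Link the generated code in place (signature differs in newer TCC). */
    if (tcc_relocate(s, TCC_RELOCATE_AUTO) < 0)
        return 1;

    /* Look up the compiled symbol and call it. */
    int (*add)(int, int) = (int (*)(int, int))tcc_get_symbol(s, "add");
    if (add != NULL)
        printf("add(2, 3) = %d\n", add(2, 3));

    tcc_delete(s);
    return 0;
}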

Checking library version of shared library (so-file) in autoconf

Libraries typically have a package/release version as well as a library version. These have very different semantics.
Is there a standard way to check the availability of a library with a given library version using Autoconf, i.e., a macro in configure.ac?
Things that can be done, but are bad in my opinion (correct me if I am wrong):
Rely on the version suffix of the .so file name. Weak verification.
Add an artificial function in each version of the library and check for it in the standard Autoconf way. Ugly, and assumes that I build the library.
Embed the library version in the header file, check that it matches with grep-ish tricks, and assume that it matches the version in the library (a sketch of this comparison appears below). Risk of mismatch between header file and library, and again assumes that I build the library.
What I want is a solution that inspects the actual library version embedded in the library itself. I feel stupid, but I have googled quite a lot and have not found anything.
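As a sketch of the third workaround above, but without grep: some libraries expose their version both as a header macro and as a run-time symbol, so a small test program (which a configure script could run via AC_RUN_IFELSE) can compare the two. libxml2 is used here purely as an example; this still relies on the library reporting its own version rather than inspecting the soname:

#include <stdio.h>
#include <string.h>
#include <libxml/xmlversion.h>
#include <libxml/parser.h>

int main(void)
{
    /* LIBXML_VERSION_STRING is what the installed headers say;
     * xmlParserVersion is reported by the shared library actually loaded. */
    printf("header  version: %s\n", LIBXML_VERSION_STRING);
    printf("library version: %s\n", xmlParserVersion);

    /* The run-time string may carry an extra suffix, so compare prefixes.
     * A non-zero exit status lets a configure script detect a mismatch. */
    return strncmp(LIBXML_VERSION_STRING, xmlParserVersion,
                   strlen(LIBXML_VERSION_STRING)) != 0;
}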

Check build parameters of libxml2.so at runtime

I am writing a simple application that links to libxml2 on the system. It works for almost all users, but one user reported this error when reading a particular xml file from the web:
Unsupported encoding ISO8859-1
This error typically indicates that libxml2 was built --without-iconv. Is there any way I can explicitly test if the libxml2 dynamic library on the system has iconv support?
I can think of two ways to do this:
Write a short, simple test program that uses the iconv feature of libxml2. It should behave differently if iconv is not present. This is what GNU configure scripts do: they test for features by exercising them.
This is a hack: find a symbol that is present in libxml2 when it is built with iconv but absent when it is not, then use a utility like nm to list the symbols in the library file and look for it.
Or just avoid the issue by packaging a working libxml2 with your application.
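In the same spirit as the first suggestion, reasonably recent libxml2 releases can simply be asked at run time which optional features they were built with, via xmlHasFeature. A small probe program along these lines (run on the affected machine, or from a configure script) avoids guessing from symbol names:

#include <stdio.h>
#include <libxml/parser.h>

int main(void)
{
    /* Queries the libxml2 shared library that actually got loaded. */
    if (xmlHasFeature(XML_WITH_ICONV)) {
        printf("libxml2 was built with iconv support\n");
        return 0;
    }
    printf("libxml2 was built without iconv support\n");
    return 1;
}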

Where to find stdio.h functions implementations?

I am studying C and I noticed that I can't find the implementation files for some headers, for example stdio.h, which declares a lot of input/output functions such as printf. Where can I find their implementations?
Download one of these:
glibc
uclibc
dietlibc
BSD libc
Or, even better, download several of these and compare their implementations. Of course, they likely do many things differently from your particular standard library implementation, but they are still quite interesting, especially for non-platform-specific functionality such as sprintf.
You need to find the source code for a C standard library like glibc: http://www.gnu.org/s/libc/
You can download the source here: http://ftp.gnu.org/gnu/glibc/
It contains source for all the library functions.
Google is your friend: just search for stdio.c. But note that whatever you find is only "one implementation of many possible" - you don't learn how your compiler's library does it by reading those, you just get an idea of how it can be done.
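To get a feel for what you will find in those sources, here is a deliberately tiny, toy printf built on top of putchar. It supports only %c, %s and %d; a real implementation adds buffering, flags, field widths, precision, floating point, locales and much more:

#include <stdarg.h>
#include <stdio.h>   /* only for putchar in this toy */

/* Print an unsigned value in decimal, most significant digit first. */
static void put_uint(unsigned int n)
{
    if (n >= 10)
        put_uint(n / 10);
    putchar('0' + (n % 10));
}

static void my_printf(const char *fmt, ...)
{
    va_list ap;
    va_start(ap, fmt);
    for (; *fmt != '\0'; fmt++) {
        if (*fmt != '%') {          /* ordinary character: copy it */
            putchar(*fmt);
            continue;
        }
        fmt++;                      /* move to the conversion specifier */
        if (*fmt == '\0')           /* stray '%' at the end of the format */
            break;
        switch (*fmt) {
        case 'c':
            putchar(va_arg(ap, int));
            break;
        case 's': {
            const char *s = va_arg(ap, const char *);
            while (*s)
                putchar(*s++);
            break;
        }
        case 'd': {
            int n = va_arg(ap, int);
            if (n < 0) {
                putchar('-');
                put_uint(-(unsigned int)n);
            } else {
                put_uint((unsigned int)n);
            }
            break;
        }
        default:                    /* unknown specifier: print it as-is */
            putchar(*fmt);
            break;
        }
    }
    va_end(ap);
}

int main(void)
{
    my_printf("char %c, string %s, int %d\n", 'x', "hello", -42);
    return 0;
}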
On Ubuntu or another OS that uses apt for package management, you can use, for example:
apt-get source libc6
Also, running gcc in verbose mode will show you which include and library paths it is using.
gcc -v
or
gcc -Wl,--verbose
If you install the Windows SDK, there is an option to include the standard library source code, so you can also see how it is implemented on Windows.

Including third-party libraries in C applications

I'm a bit naive when it comes to application development in C. I've been writing a lot of code for a programming language I'm working on, and I want to include stuff from ICU (for internationalization and Unicode support).
The problem is, I'm just not sure if there are any conventions for including a third-party library. For something like readline, where lots of systems probably have it installed already, it's safe to just link to it (I think). But what if I wanted to include a version of the library in my own code? Is this common, or am I thinking about this all wrong?
If your code requires 3rd party libraries, you need to check for them before you build. On Linux, at least with open-source, the canonical way to do this is to use Autotools to write a configure script that looks for both the presence of libraries and how to use them. Thankfully this is pretty automated and there are tons of examples. Basically you write a configure.ac (and/or a Makefile.am) which are the source files for autoconf and automake respectively. They're transformed into configure and Makefile.in, and ./configure conditionally builds the Makefile with any configure-time options you specify.
Note that this is really only for Linux. I guess the canonical way to do it on Windows is with a project file for an IDE...
If it is a .lib (a static library) with no runtime-linked dependencies, it gets compiled into your code. If you need to link to dynamic libraries, you will have to make sure they are present: provide an installer, or point the user to where they can obtain them.
If you are talking about shipping your software to end users and are worried about dependencies, you have to provide correct packages/installers that include the dependencies needed to run your software, or otherwise make sure the user can get them (subject to local laws, export laws, and so on, but that is all about licensing).
You could build your software and statically link in ICU and whatever else you use, or you can ship your software and the ICU shared libraries.
It depends on the OS you're targeting. For Linux and Unix systems, you will typically see dynamic linking, so the application will use the library that is already installed on the system. If you do this, it's up to the user to obtain the library if they don't already have it. Package managers on Linux will do this for you if you package your application in the distro's package format.
On Windows you typically see static linking, which means the application bundles the library and will use that specific version. Many different applications may use the same library but include their own copy, so you can end up with many copies of the library floating around on your system.
The problem with shipping a copy of the library with your code is that you don't get the benefit of the library's maintainers' bug fixes for free. Obscure, small, and unsupported libraries are generally worth linking statically. Otherwise I'd just add the dependency and ensure that whatever packages you ship indicate it appropriately.
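To make the ICU case concrete, here is a minimal sketch of a program depending on a third-party library, with the link step spelled out in a comment. The pkg-config module name icu-uc and the exact commands are assumptions that vary by platform and ICU version:

/* Dynamic link against the system ICU (uses libicuuc at run time):
 *   cc demo.c $(pkg-config --cflags --libs icu-uc) -o demo
 * Static linking instead bakes the library into the executable,
 * typically by pointing the linker at ICU's .a archives. */
#include <stdio.h>
#include <unicode/uchar.h>

int main(void)
{
    /* U+00E9 (e with acute) should map to U+00C9. */
    UChar32 upper = u_toupper(0x00E9);
    printf("u_toupper(U+00E9) = U+%04X\n", (unsigned int)upper);
    return 0;
}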
