I tried this command to configure Thrift:
./configure CXX=arm-linux-gnueabi-g++ CC=arm-linux-gnueabi-gcc --prefix=/arms/thrift --host=arm-linux-gnueabi --with-cpp --with-boost=/path-to-boost-for-arm
and got the following error message:
checking for boostlib >= 1.40.0... yes
checking for libevent >= 1.0...
configure: error: in `/arms/thrift-0.9.0':
configure: error: cannot run test program while cross compiling
Is there any solution?
You get the error because a dependency it is trying to find is missing, so first cross-compile every dependency it searches for. Then run
./configure --help
The output shows how to point configure at each pre-built dependency:
--with-(dependency)=path-to-compiled-bin
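For example, here is a minimal sketch for libevent, assuming your toolchain triplet is arm-linux-gnueabi and that your Thrift version supports a --with-libevent option (check ./configure --help); the version number and staging paths are only illustrative:
# cross-compile libevent for ARM and install it into a staging prefix
cd libevent-2.0.21          # version number is only an example
./configure --host=arm-linux-gnueabi --prefix=/arms/libevent
make && make install

# then point Thrift's configure at the cross-compiled dependency
cd ../thrift-0.9.0
./configure CXX=arm-linux-gnueabi-g++ CC=arm-linux-gnueabi-gcc \
    --host=arm-linux-gnueabi --prefix=/arms/thrift --with-cpp \
    --with-boost=/path-to-boost-for-arm --with-libevent=/arms/libevent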
Thrift 0.9.0 is BROKEN for cross-compiling. Part of the problem you're seeing is that the build has static paths for at least some (if not all) of the dependencies that don't provide pkg-config answers, so it's looking outside of your sysroot for all sorts of things right now.
There's an issue logged in their Jira, but the position they take is "have you set your --includedir parameter?" (Uh, --includedir specifies where things are within my sysroot, and the build is supposed to honor things like turning off PHP (it doesn't right now... sigh...) and a --with-libtool-sysroot that prefixes everything so you can cross-compile.) So I don't think help will be forthcoming anytime soon.
I am trying to compile C code on Oracle Linux 7.2, which is hosted as a VM on Windows 10.
Name of file run: configure
Name of log file: config.log
Error where I am stuck:
gcc: error: unrecognized command line option '-V'
As per my understanding of the code structure so far, there is a file named configure which contains compilation-related commands; this file generates Makefile.am, which further generates Makefile.in, and at last the Makefile.
Please help me solve the error, and also let me know if my understanding of configure and the makefiles is incorrect.
configure scripts explore the environment in which a program is to be built. They then adjust the tools called, the options used and the libraries linked, among other things. Some of that information is obtained by trying to execute programs with certain options; failure of a program to run is the intended way of learning that the program is not available or does not accept those options. It is therefore not necessarily an error if one of these probes fails: that may be one of the legitimate outcomes, and the exit code of the compiler (an error, here) will be used to adjust the generated Makefile accordingly, for example by omitting -V ;-).
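To make that mechanism concrete, here is a rough sketch of what such a probe does under the hood (the file names are made up; real configure scripts generate and clean up their own temporaries):
# does the compiler accept -V? try it and inspect the exit code
cat > conftest.c <<'EOF'
int main(void) { return 0; }
EOF
if gcc -V conftest.c -o conftest >/dev/null 2>&1; then
    echo "compiler accepts -V"
else
    echo "compiler does not accept -V; configure records that and moves on"
fi
rm -f conftest.c conftest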
Does the configure script actually stop there, or are you just observing the error in the log file? If you search for gcc -V on the web you'll find examples of configure scripts actually failing later (for unrelated reasons) whose logs contain the same "-V error" line. Could that be the case? I would assume that errors which actually cause configure to stop and not produce a Makefile should be visible on the command line, not only in the log file.
As an aside, it is worthwhile to run ./configure --help and look through the options. Some may improve the build process or the result; for example you can usually tell configure that you are using gcc, GNU ld and so on, or that you don't need certain features (like X25 ;-) ).
You should look into the Makefile of your project, identify where the misspelled -V option is, and replace it with -v (lowercase). As pointed out by others in the comments, -V is not a compilation flag; it is meant to report the compiler's version.
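If you want to track the offending option down, something like the line below is usually enough (the file list is a guess; note that configure itself legitimately tries $CC -V as part of its version probes, so only change occurrences that end up in actual compile rules):
grep -n -e '-V' Makefile Makefile.in configure 2>/dev/null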
I'm about to implement AODV on an ARM board (SabreLite) and I'm facing some problems.
So, I use the latest version of AODV-UU, located here (sourceforge.net/projects/aodvuu/). I've followed the instructions given in the README file, but at the end I get this error:
kaodv-mod.c:22:27: fatal error: linux/version.h: No such file or directory
#include <linux/version.h>
Since the board uses kernel version 3.0.35, I downloaded it and just changed the kernel directory in the Makefile. That should normally have worked, based on the instructions (http://w3.antd.nist.gov/wctg/aodv_kernel/kaodv_arm.html). The error above suggests that I don't have version.h, but I checked and I have all of the Linux header files installed, so it can't be that.
On step 6 of the tutorial (README file), I did not compile the 3.0.35 kernel, because I'm fairly confident it already has the proper netfilter support for AODV-UU, given that it is a recent kernel. (That step is really a configuration suggestion for 2.4 and 2.6 kernels, so I don't think I should be obliged to do it here.)
What could be the solution to this?
Do I really need to compile this kernel version (3.0.35) before going any further?
Do I have to change the AODV code, and if so, which files do I have to modify?
Thanks in advance!
Thanks for your response, but unfortunately I've already done that. By that I mean I've chosen the kernel source tree that matches the target kernel (linux-imx6-boundary-imx_3.0.35_4.1.0). I've also set up my cross-compiler so that my environment variables are ready for cross-compilation. Here is the output.
echo $CC:
arm-oe-linux-gnueabi-gcc -march=armv7-a -mthumb-interwork -mfloat-abi=hard -mfpu=neon -mtune=cortex-a9 --sysroot=/usr/local/oecore-x86_64/sysroots/cortexa9hf-vfp-neon-oe-linux-gnueabi
and some of my env variables look like this:
ARCH=arm
CROSS_COMPILE=arm-oe-linux-gnueabi-
CFLAGS= -O2 -pipe -g -feliminate-unused-debug-types
RANLIB=arm-oe-linux-gnueabi-ranlib
After all of this configuration, I still get the error. I really don't think that I have to recompile the kernel.
In order to build modules, you need a kernel source tree in a state that matches the target kernel, i.e. not an untouched, freshly downloaded one. Don't mistake the presence of extra board-specific patches/drivers/etc. in a vendor kernel for configuration; to get the source tree into the right state to use, you still need to:
configure it correctly: make ARCH=arm <whatever>_defconfig (and/or any .config tweaks your board needs)
then build it: make ARCH=arm CROSS_COMPILE=<your toolchain triplet>
You need to actually build the kernel because there are many important files that don't exist yet, like the contents of include/generated (where the aforementioned version.h is created), the corresponding arch/$ARCH/include/generated, the checksums for module versioning, and probably more, which will all be different depending on which architecture and particular configuration options were chosen.
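For the tree mentioned in the comments, the preparation would look roughly like this (the defconfig name is a guess; check arch/arm/configs/ in the vendor source for the real one):
cd linux-imx6-boundary-imx_3.0.35_4.1.0
make ARCH=arm CROSS_COMPILE=arm-oe-linux-gnueabi- imx6_defconfig   # defconfig name is a guess
make ARCH=arm CROSS_COMPILE=arm-oe-linux-gnueabi- -j4
# after this the generated headers (including linux/version.h) and
# Module.symvers exist, so external modules can be built against the tree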
My bad for missing the crucial detail in the question, but upon downloading the linked AODV to try this myself it became clear: the makefile is designed for the 2.4 build system, which was rather different (and which I'm not familiar with). Getting it to build against a post-2.6 kernel will require writing a new makefile.
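Untested, but such a makefile would follow the usual out-of-tree module pattern, roughly as in the sketch below (the object list and paths are placeholders, not taken from AODV-UU itself, and recipe lines must be indented with tabs):
# sketch of a kbuild-style makefile for the kaodv module
ifneq ($(KERNELRELEASE),)
obj-m := kaodv.o
kaodv-objs := kaodv-mod.o        # add the remaining kaodv-*.o objects here
else
KERNELDIR ?= /path/to/linux-imx6-boundary-imx_3.0.35_4.1.0
default:
	$(MAKE) -C $(KERNELDIR) M=$(PWD) ARCH=arm \
		CROSS_COMPILE=arm-oe-linux-gnueabi- modules
endif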
I have been following the instructions here to compile an AVR cross-compiler toolchain on my x86 PC. I had to do some searching on the step for compiling gcc because it was complaining about some missing dependencies; it turns out there was a script I could just run which downloaded the prerequisites.
Now however, I am stuck on the step where I'm supposed to compile avr-libc. I get this error:
blah#blah-computer:~/avr/avr-libc-1.8.0$ ./configure --prefix=$PREFIX --build='./config.guess' --host=avr
checking build system type... Invalid configuration `./config.guess': machine `./config.guess' not recognized
configure: error: /bin/bash ./config.sub ./config.guess failed
From my googling of this problem, it seems there might be something wrong with my installation of autotools. I actually remember seeing some warning about autotools when I was compiling gcc, but as far as I could tell that step was successful.
Could anybody give me a good quick explanation of what autotools is and maybe help me fix my problem?
I don't see why you would need to specify --build at all. Just set the environment variable: CC=avr-gcc (the AVR cross compiler), and use --prefix=... and --host=avr as before. AVR-libc will set the CFLAGS to the 'best' values as it builds each ISA version in the multi-lib.
However, if you want to specify --build, use the backtick characters rather than the single-quote characters to execute the script: --build=`./config.guess`
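Putting that together, the invocation would look roughly like this (with $PREFIX set as in the original attempt and avr-gcc already on the PATH):
# simplest form: let configure detect the build system itself
CC=avr-gcc ./configure --prefix=$PREFIX --host=avr

# or, if you do pass --build, use backticks so the script is executed
CC=avr-gcc ./configure --prefix=$PREFIX --build=`./config.guess` --host=avr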
I'm trying to compile a simple hello world application to be run on uClinux (2.4), which is running on a board with a Freescale ColdFire (MCF5280C) processor... and I'm not quite sure what to do here.
I know I need to compile with the correct version/tools from Freescale to target this hardware, so I downloaded and installed the ColdFire toolchain and verified that the one I have is for my target:
mike#linux-4puc:/usr/local/m68k-elf/bin> ./gcc -v
Reading specs from /usr/local/lib/gcc-lib/m68k-elf/2.95.3/specs
gcc version 2.95.3 20010315 (release)(ColdFire patches - 20010318 from http://fiddes.net/coldfire/)(uClinux XIP and shared lib patches from http://www.snapgear.com/)
I tried a simple gcc "file" type command:
mike#linux-4puc:/home/mike> /usr/local/m68k-elf/bin/gcc test.c
/usr/local/m68k-elf/bin/ld.real: cannot open crt0.o: No such file or directory
collect2: ld returned 1 exit status
That does not work at all, so it's clearly more complex than that. The output almost looks like it wants me to build the toolchain before I use it? Has anyone ever done this before? I'm not sure what I need to do or whether I just need some flags.
You might also try checking whether you have a command called m68k-elf-gcc or something along those lines; this is a common naming convention for cross-compilers.
As for your problem, it sounds like there is something wrong with your compiler setup. crt0.o is the object file that contains the C runtime setup code. The linker (which is what is actually giving the error) should know where this file is if it is set up properly.
When you installed, you should have run make install as the last step, without having modified anything since the make step. The configuration step sets up certain variables and paths based on where the toolchain is supposed to be installed.
Where did you get a Freescale toolchain? I took a look at their site and it seemed only third parties supplied C++ cross-compilers. In the toolchain I get from NetBurner (for use with their hardware), the crt0.o file exists under the gcc-m68k\m68k-elf\lib directory.
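Two quick checks may help narrow this down; -print-file-name and -B are standard gcc options, though the directory shown is only an example of where crt0.o might live in such an installation:
/usr/local/m68k-elf/bin/gcc -print-file-name=crt0.o
# if only "crt0.o" comes back, the file is not on the toolchain's search path

/usr/local/m68k-elf/bin/gcc -B/usr/local/m68k-elf/m68k-elf/lib test.c
# -B adds a directory to the compiler's startup-file/library search path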
I'm on a network where I don't have root access, so everything I install is under a prefix ~/bin (actually referenced by its full path).
So I have Openbox working fine, which is what I'm using to send this from. For Imlib2 I do ./configure --prefix=~/bin; make; make install.
Then, from the tint2 source directory, I run
IMLIB2_CFLAGS=-i~/bin/include/Imlib2.h *only typoed here
export IMLIB2_CFLAGS
IMLIB2_LIBS=-l~/bin/lib/libImlib2.a
export IMLIB2_LIBS
./configure --prefix=~/bin
which leaves me with this charming message:
checking for IMLIB2... yes
checking for imlib_context_set_display in -lImlib2... no
configure: error: Imlib2 must be built with X support
Edit:
So Imlib2 is now compiled --with-x and installed to the location I'm referencing. I am still receiving an identical error message.
I'm guessing it's because I don't know what the flags are for the initial configure of imlib2?
Probably, yes. ./configure --help will usually give you advice on what to do (i.e., how to pass the correct information to the configure script; but you'll need to find out what that information is for imlib2 itself).
If your Q is accurate, you should fix the spelling of CLFAGS in the first line.
More generally, you could use:
CPPFLAGS=-I~/bin/include LDFLAGS=-L~/bin/lib ./configure ...
However, as the accepted answer suggests, there is often a direct way to specify the location of pre-requisite software packages.
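For instance, assuming the Imlib2 you built installed a pkg-config file under ~/bin/lib/pkgconfig, something like this lets tint2's configure find it ($HOME is used instead of ~ so the expansion inside the assignments is unambiguous):
export PKG_CONFIG_PATH=$HOME/bin/lib/pkgconfig
./configure --prefix=$HOME/bin

# or, if setting the variables by hand, point them at directories and
# libraries rather than at individual files
IMLIB2_CFLAGS="-I$HOME/bin/include" \
IMLIB2_LIBS="-L$HOME/bin/lib -lImlib2" \
./configure --prefix=$HOME/bin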
See also: Linking with a different .so file in Linux.