Cross-compiled application does not run on Raspberry Pi

I'm just taking my first steps with Buildroot and a Raspberry Pi (running Raspbian), but somehow I seem to be doing something wrong with the cross-compilation. The application is a very simple Hello World program written in C. This is what I did:
Downloaded and installed buildroot
make raspberrypi2_defconfig
make toolchain
Then I wrote the tiny application and the following Makefile:
CROSS_BIN := /home/me/raspi/buildroot-2016.05/output/host/usr/bin
SYSROOT := /home/me/raspi/buildroot-2016.05/output/host/usr/arm-buildroot-linux-uclibcgnueabihf/sysroot
PATH := $(CROSS_BIN):$(PATH)
CC := arm-linux-gcc
CFLAGS := --sysroot=$(SYSROOT)
app: app.c
	$(CC) $(CFLAGS) -o $@ $<
Compiled the app and copied it to the Raspberry Pi. When I tried to run it, the RPi complained that it couldn't find the file (though it's there and executable for sure). The binary type seems OK to me and should fit the CPU:
pi@raspberrypi:~ $ ./app
-bash: ./app: No such file or directory
pi@raspberrypi:~ $ ls -l app
-rwxr-xr-x 1 pi pi 4916 Jul 10 11:07 app
pi@raspberrypi:~ $ file app
app: ELF 32-bit LSB executable, ARM, EABI5 version 1 (SYSV), dynamically linked, interpreter /lib/ld-uClibc.so.0, not stripped
pi@raspberrypi:~ $ lscpu
Architecture: armv7l
Byte Order: Little Endian
CPU(s): 4
On-line CPU(s) list: 0-3
Thread(s) per core: 1
Core(s) per socket: 4
Socket(s): 1
Model name: ARMv7 Processor rev 5 (v7l)
CPU max MHz: 900.0000
CPU min MHz: 600.0000
Can somebody tell me what I'm doing wrong? If I compile the app natively and run it on the development host, it runs without a problem.

My money is on /lib/ld-uClibc.so.0 missing from your Raspberry Pi. Am I right?
Okay, this file is your dynamic loader; it is responsible for loading your shared libraries at runtime. It maps the required shared libraries into the process address space and sets the appropriate permissions on the memory (read-only, read-write, executable).
Your cross-compiled binary requires a loader that does not exist on the target, probably due to a mismatch between the image installed on the RPi and your cross-compilation environment (the sysroot).
There are several ways to fix it. Let's start by examining a binary that works: run file /bin/ls and post the dynamic loader it reports here.
For example:
$ file /bin/ls
/bin/ls: ELF 32-bit LSB executable, ARM, EABI5 version 1 (SYSV), dynamically linked, interpreter /lib/ld-linux-armhf.so.3, for GNU/Linux 2.6.32, BuildID[sha1]=5e052e8de057d379ab51d4af510ad9318fe77b46, stripped
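As a side check (a minimal sketch; the readelf output below is illustrative), you can compare the interpreter your cross-built binary requests with what actually exists on the Pi:
$ readelf -l app | grep interpreter
      [Requesting program interpreter: /lib/ld-uClibc.so.0]
pi@raspberrypi:~ $ ls /lib/ld-uClibc.so.0    # if this fails, the loader mismatch is confirmed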

When I tried to run it, the RPi complained that it couldn't find the file (though it's there and executable for sure).
The shell is complaining that it cannot find a file in order to execute your program, not that it cannot find your file.
If it's installed, use the strace command to determine which file cannot be found.
Most likely you have a dynamic library issue, e.g. you built with a uClibc toolchain, but your root filesystem has glibc.
Two common solutions:
(A) build your program with static linking, so that it no longer depends on the libraries installed on the target system (a quick check of the result is sketched after this list):
	$(CC) $(CFLAGS) -static -o $@ $<
(B) rebuild the Buildroot toolchain to match the C library that is already installed on your RPi, i.e. instead of a uClibc toolchain, build a glibc toolchain with a matching version number.
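For option (A), once the program has been relinked with -static you can sanity-check it with file; a statically linked binary no longer mentions an interpreter (output abbreviated, for illustration only):
$ file app
app: ELF 32-bit LSB executable, ARM, EABI5 version 1 (SYSV), statically linked, not stripped
The binary will be noticeably larger, but it can run without any libraries from the target's root filesystem.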

Related

How to compile neon on ARM

I used the
arm-linux-gnueabi-g++ test.cpp -march=armv7-a -mfloat-abi=softfp -mfpu=neon -o test
on Ubuntu to get an executable file for ARM, but when I ran
adb push ./test /data/test
adb shell
cd data
chmod 777 test
./test
I got the following error:
./system/bin/sh: ./test: No such file or directory
I was confused about this.
If you intend to run the executable on Android (as it seems), you should ideally build it using the Android NDK. The problem is that your executable links against glibc, which is available on normal Linux systems but not on Android. (In detail, the executable can't start because it requires the dynamic linker /lib/ld-linux.so.3, which isn't available on Android. In addition, it also requires glibc in the form of libc.so.6.)
If you build executables using the Android NDK, it will link to the Bionic libc, which is what is available on Android.
Alternatively, if you add -static when linking (in your case, in your single compile+link command), you'll get a static executable, which should work on both normal Linux and Android.
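A sketch of that -static route (same flags as in the question, plus -static; readelf is used here only to confirm that no interpreter is requested):
$ arm-linux-gnueabi-g++ test.cpp -march=armv7-a -mfloat-abi=softfp -mfpu=neon -static -o test
$ readelf -l test | grep interpreter     # prints nothing for a fully static executable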

Linking 32 bit files on 64 bit machine with ld

I compiled my assembly file with nasm:
nasm -felf32 test.nasm -o test.o
I did this on a 64-bit OS (Ubuntu). Now I would like to link it with:
gcc -m32 test.o -o test
but I cannot, because the 32-bit libraries are not installed.
I know that installing the 32-bit libraries is one solution, but I cannot install them because I am not root. Can I solve the problem in another way, for example by using ld and downloading the needed *.so files from the Internet?
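One possible sketch, assuming test.nasm defines its own _start and makes system calls directly (i.e. it does not use libc): in that case ld alone can link the 32-bit executable, and no 32-bit libraries are needed at all:
$ nasm -felf32 test.nasm -o test.o
$ ld -m elf_i386 test.o -o test      # -m elf_i386 selects 32-bit output on 64-bit binutils
$ ./test
If the program does call into libc, you would still need the 32-bit C library and startup files, which brings you back to installing (or locally extracting) the 32-bit packages.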

Objdump doesn't recognize the architecture of a shared library

I built a shared library on Ubuntu 14.04 for the ARM platform. It compiled and built successfully, and I can inspect the exported symbols with nm, but when I check the .so file header I am told that the architecture is unknown.
Is the library built correctly? Why is its architecture unknown?
objdump -f libMyLib.so
libMyLib.so: file format elf32-little
architecture: UNKNOWN!, flags 0x00000150:
HAS_SYMS, DYNAMIC, D_PAGED
start address 0x000033a0
You need to use the objdump binary provided by the toolchain for your target system (ARM), not the one from the host system (x86_64).
As an example: I have set up a Linux x86_64 host targeting OpenWrt MIPS; my toolchain folder contains files such as:
mips-openwrt-linux-gnu-ar
mips-openwrt-linux-gnu-as
mips-openwrt-linux-gnu-gcc
mips-openwrt-linux-gnu-ld
mips-openwrt-linux-gnu-objdump
mips-openwrt-linux-gnu-nm
These are the tools for manipulating programs built for the OpenWrt MIPS system, so instead of just calling objdump, I need to call ./mips-openwrt-linux-gnu-objdump -f <bin file> to get the proper output for the compiled file.
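For example (a sketch; output trimmed): readelf, unlike objdump, decodes ELF headers without needing an architecture-specific backend, so even the host copy can identify the file:
$ readelf -h libMyLib.so | grep -E 'Class|Machine'
  Class:                             ELF32
  Machine:                           ARM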

C - Unable to statically link to OpenSSL on Ubuntu

Following the instructions given here, I’ve downloaded the latest version of OpenSSL (openssl-1.0.1e.tar.gz) from here and installed it on Ubuntu v12.10 (32-bit).
I have a C project in Eclipse CDT (v1.2.0.201212170456) that statically links to the following two .a files:
home/usr/local/ssl/lib/libcrypto.a
home/usr/local/ssl/lib/libssl.a
However when I build my project I get these errors:
/home/tashimaya/Applications/CodeSourcery/bin/../lib/gcc/arm-none-linux-gnueabi/4.4.1/../../../../arm-none-linux-gnueabi/bin/ld: skipping incompatible /usr/local/ssl/lib/libssl.a when searching for -lssl
/home/tashimaya/Applications/CodeSourcery/bin/../lib/gcc/arm-none-linux-gnueabi/4.4.1/../../../../arm-none-linux-gnueabi/bin/ld: cannot find -lssl
My toolchain is CodeSourcery (Sourcery G++ Lite 2010q1-202) and is for 32-bit OS.
What am I doing wrong?
Compiler command line I'm using:
arm-none-linux-gnueabi-gcc -I"/path to my/include" -O0 -g3 -Wall -c -fmessage-length=0 -v -MMD -MP -MF"main.d" -MT"main.d" -o "main.o" "../main.c"
You have installed OpenSSL on an Ubuntu 32-bit machine (assuming x86), but are trying to link it to an ARM binary:
/home/tashimaya/Applications/CodeSourcery/bin/../lib/gcc/arm-none-linux-gnueabi: your ARM toolchain
/usr/local/ssl/lib/libssl.a: a 32-bit x86 version of OpenSSL
You will have to cross-compile OpenSSL for ARM using your ARM toolchain (i.e.: arm-none-linux-gnueabi-gcc), then you will be able to link it to an ARM binary.
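A rough sketch of such a cross-build, assuming OpenSSL 1.0.1x sources (Configure target names and the exact variable overrides can differ between OpenSSL versions, and the $HOME/openssl-arm prefix is just an example):
$ cd openssl-1.0.1e
$ ./Configure linux-armv4 no-shared --prefix=$HOME/openssl-arm
$ make CC=arm-none-linux-gnueabi-gcc AR='arm-none-linux-gnueabi-ar r' RANLIB=arm-none-linux-gnueabi-ranlib
$ make install
Then point the project's library path at $HOME/openssl-arm/lib instead of /usr/local/ssl/lib.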
It says that /usr/local/ssl/lib/libssl.a is not of the kind the linker expects. Try file on it to check whether you compiled it as 32-bit or 64-bit, and check how you are compiling your own program too. If both match, the linker (ld) will link it fine.
If you compile your program as 64-bit and link it against a 32-bit libssl.a, it will not work.
example:
file a.out
/* kind of output */ a.out: Mach-O 64-bit executable x86_64
http://unixhelp.ed.ac.uk/CGI/man-cgi?file
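Note that file only reports "current ar archive" for a static library, so to see what its members were built for you can ask readelf instead (a sketch; the Intel 80386 line is what a 32-bit x86 build would show, an ARM build would show ARM):
$ readelf -h /usr/local/ssl/lib/libssl.a | grep Machine | sort -u
  Machine:                           Intel 80386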

Cross Compiled Binary Does Not Exist?

I cross-compiled a program on Ubuntu 12.04 running on x86, using gcc-arm-linux-gnueabi and binutils-arm-linux-gnueabi and compiling with arm-linux-gnueabi-gcc instead of gcc, with my target architecture being ARM. It compiles fine with no errors or warnings.
When I try to run it on the ARM machine (Pandaboard - also running Ubuntu 12.04) I get:
bash: ./sttyl: No such file or directory
I know the file is there and it has the proper permissions:
-rwxrwxr-x 1 haziz haziz 8.5K Feb 10 10:34 sttyl
The output of file sttyl is
sttyl: ELF 32-bit LSB executable, ARM, version 1 (SYSV), dynamically linked (uses shared libs), for GNU/Linux 2.6.31, BuildID[sha1]=0x6d504f7d84b93603122223a89e2b5960c840309f, not stripped
When I compile it natively on the Pandaboard it compiles and runs fine. This is the output of file sttyl on the natively compiled copy:
sttyl: ELF 32-bit LSB executable, ARM, version 1 (SYSV), dynamically linked (uses shared libs), for GNU/Linux 2.6.31, BuildID[sha1]=0x9897c785266c5b7cdf580e08091aba239db84ecf, not stripped
What am I doing wrong? Moreover, if I had made a mistake in the cross-compilation, I would have expected the shell/kernel to tell me that the executable is for the wrong architecture, not that it does not exist!
It isn't telling you that ./sttyl doesn't exist.
It's telling you that it spawned a new process, which exited with error No such file or directory. No information is provided about which file or directory was missing.
For example, a shell script starting with #!/bin/specialsh could generate that error if the interpreter /bin/specialsh was missing, even though the script existed and was executable.
Try using strace to find out what call (and path) caused the error.
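A minimal sketch of that (output abbreviated): the very first execve() already fails with ENOENT even though ./sttyl exists, which points at a missing ELF interpreter rather than a missing binary:
$ strace ./sttyl
execve("./sttyl", ["./sttyl"], [/* ... */]) = -1 ENOENT (No such file or directory)
readelf -l ./sttyl | grep interpreter will then show which loader the binary wants, and you can check whether that path exists on the Pandaboard.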
