Build and link µIP library with no OS - embedded

I'm relatively new to embedded development and I have a question, or rather a request for feedback, on building and linking the µIP library on an embedded device. For what it's worth, the following is for a FOX G20 V board with an Atmel AT91SAM9G20 processor and no OS.
I have done some research, and as I see it there are two options for building and linking the library for the board.
Option 1: The first option would be to compile the whole library (the .c files) into a static library in the form of a .a file. Then I can link the created static library with my application code before loading it onto the device. Of course, the device driver will have to be written so that the library can work on the platform (help was found here). This first option would be done on a Linux machine. Also, for this first option: to load the application code linked with the static library onto the device, do I do so with scp?
Option 2: The second option would be to compile and link the library with my application code directly, without going through an intermediate static library. However, since my platform does not contain an OS, I would need to install an appropriate GCC compiler in order to compile and link (if anyone has any leads for such an installation, that would be very helpful as well). I'm quite unfamiliar with this second option, but I've been told that it is easier to implement, so if anyone has an idea of how to implement it, that would be very helpful too.
Along with the answers, I would appreciate some feedback on whether these options seem correct to you, and on whether I have stated anything that is wrong.

There is no real difference between these options. In any case, the host toolchain is responsible for creating a binary file that contains a fully linked executable with no external dependencies, so you need a cross compiler either way, and it is indeed easiest to just compile uIP along with the rest of the application.
The toolchain will typically have a cross compiler (if you use gcc, it should be named arm-eabi-gcc or arm-none-eabi-gcc), cross linker (arm-eabi-ld), cross archiver (arm-eabi-ar) etc. You would use these instead of the native tools. For Debian, you can find a cross compiler for ARM targets without an OS in testing/unstable.
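For example, on a recent Debian or Ubuntu system the bare-metal ARM toolchain can usually be installed straight from the package repositories (the exact package names may vary between releases):
sudo apt-get install gcc-arm-none-eabi binutils-arm-none-eabi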
Whether you build a static library
arm-eabi-gcc -c uip.c
arm-eabi-ar cru uip.a uip.o
arm-eabi-ranlib uip.a
arm-eabi-gcc -o executable application.c uip.a
or directly link
arm-eabi-gcc -c application.c
arm-eabi-gcc -c uip.c
arm-eabi-gcc -o executable application.o uip.o
or directly compile and link
arm-eabi-gcc -o executable application.c uip.c
makes no real difference.
If you use an integrated development environment, it is usually easiest to just add uip.c as a source file.
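As a rough sketch of what a complete bare-metal build might look like for this board (the startup file startup.s, the linker script board.ld and the output names are placeholders you would replace with your own; the AT91SAM9G20 core is an ARM926EJ-S, hence the -mcpu flag):
arm-none-eabi-gcc -mcpu=arm926ej-s -c startup.s application.c uip.c
arm-none-eabi-gcc -mcpu=arm926ej-s -nostartfiles -T board.ld -o app.elf startup.o application.o uip.o
arm-none-eabi-objcopy -O binary app.elf app.bin
Since there is no OS on the target, there is also no ssh server, so app.bin would normally be loaded with whatever boot mechanism the board provides (for example Atmel's SAM-BA tool, an SD card, or a JTAG probe) rather than over scp.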

Related

Static library GNUSTEP and correct linking

I was checking out the portability of Objective-C via GNUstep and ran into some problems...
Everything works on my two machines, but the major problem arises if I run my application on a platform where GNUstep is not pre-installed... So I want to build it with static libraries. But I ran into several problems:
1.) I can't find the static libraries under /usr/local/lib, so the question came up: do they even exist within GNUstep?
2.) In case static libraries are available, how do I integrate them correctly into my gcc command?
sudo gcc -o main main.m GameRef.m SDLApplication.m SDLEvent.m SDLImage.m SDLMap.m SDLSprite.m Settings.m Utility.m -I -static `gnustep-config --variable=GNUSTEP_SYSTEM_HEADERS` -L `gnustep-config --variable=GNUSTEP_SYSTEM_LIBRARIES` -lgnustep-base -lSDL -fconstant-string-class=NSConstantString -std=c99 2>logFile
I'm currently using Ubuntu 12.04 LTS and installed SDL and GNUstep on one machine, so the application runs fine there... but not on the second machine, because the shared libraries are missing. So I need to link them statically, but how?
The libraries in /usr/local/lib and other system 'lib' directories will be dynamic. They can't be used as static (AFAIK), and finding them wouldn't really help.
I'm no expert with GNUstep, but it sounds like you are missing the Objective-C runtime. You will need to download the source code of the GNUstep libraries and frameworks, and then compile them into static libraries yourself.
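If you do end up with a static archive, a minimal sketch of how to use it is to locate the .a file and pass it to gcc by path instead of with -l, which forces that one library to be linked statically (the archive path below is only illustrative, and you will probably also need to pull in -lobjc and any other libraries gnustep-base depends on explicitly):
find /usr /usr/local -name 'libgnustep-base.a'
gcc -o main main.m GameRef.m SDLApplication.m SDLEvent.m SDLImage.m SDLMap.m SDLSprite.m Settings.m Utility.m -I `gnustep-config --variable=GNUSTEP_SYSTEM_HEADERS` /usr/local/lib/libgnustep-base.a -lobjc -lSDL -fconstant-string-class=NSConstantString -std=c99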
Really, wrapping all of those frameworks into your application will just add unnecessary work for both you and your end users. Dynamic libraries exist for a purpose. There's no reason to have multiple copies of the same code on the filesystem. Just require GNUstep as a dependency. Although it's a slight pain for the users, they only need to do it once, and with most distros, installation is only a command or two away.

Building a cross-platform application (using Rust)

I started to learn Rust programming language and I use Linux. I'd like to build a cross-platform application using this language.
The question might not be related to the Rust language in particular, but nonetheless, how do I do that? I'm interested in building a "Hello World" cross-platform application as well as more complicated ones; I just need to get the idea.
So what do I do?
UPDATE:
What I want is the ability to run a program on 3 different platforms without changing the sources. Do I have to build a new binary file for each platform from the sources, just like I would in C?
To run on multiple platforms you need to build an executable for each, as @huon-dbauapp commented.
This is fairly straightforward with Rust. You use "--target=" with rustc to tell it what you want to build. The same flag works with Cargo.
For example, this builds for an ARM target:
cargo build --target=arm-unknown-linux-gnueabihf
See the Rust Flexible Target Specification for more about targets.
However, Rust doesn't ship with the std Crate compiled for ARM (as of June 2015). If this is the case for your target, you'll first need to compile the std Crates for the target yourself, which involves compiling the Rust compiler from source, and specifying the target for that build!
For information, most of this is copied from: https://github.com/japaric/ruststrap/blob/master/1-how-to-cross-compile.md
The following instructions are for gcc, so if you don't have this you'll need to install it. You'll also need the corresponding cross compiler tools, so for gcc:
sudo apt-get install gcc-arm-linux-gnueabihf
Compile Rust std Crate For ARM
The following example assumes you've already installed the current Rust Nightly, so we'll just get the sources and compile for ARM. If you are using a different version of the compiler, you'll need to get that to ensure your ARM libraries match the version of the compiler you're using to build your projects.
mkdir ~/toolchains
cd ~/toolchains
git clone https://github.com/rust-lang/rust.git
cd rust
git pull  # only needed to update an existing checkout; a fresh clone is already current
Build rustc for ARM
cd ~/toolchains/rust
./configure --target=arm-unknown-linux-gnueabihf,x86_64-unknown-linux-gnu
make -j4
sudo make install
Note "-j4" needs at least 8GB RAM, so if you hit a problem above try "make" instead.
Install the ARM rustc libraries into the native rustc build
sudo ln -s $HOME/toolchains/rust/arm-unknown-linux-gnueabihf /usr/lib/rustlib/arm-unknown-linux-gnueabihf
Create hello.rs containing:
pub fn main() {
    println!("Hello, world!");
}
Compile hello.rs, and tell rustc the name of the cross-compiler (which must be in your PATH):
rustc -C linker=arm-linux-gnueabihf-gcc-4.9 --target=arm-unknown-linux-gnueabihf hello.rs
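If you build with Cargo rather than invoking rustc directly, the same linker choice can be recorded once in a .cargo/config file in the project (the compiler name below is the one used above; adjust it to whatever cross-gcc you installed):
[target.arm-unknown-linux-gnueabihf]
linker = "arm-linux-gnueabihf-gcc-4.9"
After that, cargo build --target=arm-unknown-linux-gnueabihf picks the linker up automatically.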
Check that the produced binary is really an ARM binary:
$ file hello
hello: ELF 32-bit LSB shared object, ARM, EABI5 version 1 (SYSV), (..)
SUCCESS!!!
Check that the binary also runs on an actual ARM device:
$ scp hello me@arm:~
$ ssh me@arm ./hello
Hello, world!
I've used this to build and link a Rust project with a separate C library as well. Instructions similar to the above on how to do this, dynamically or statically, are in a separate post, but I've used up my link quota already!
The best way to figure this out is to download the source code for Servo and explore it on your own. Servo is absolutely a cross-platform codebase, so it will have to address all of these questions, whether they be answered in build/configuration files, or the Rust source itself.
It looks like the Rust compiler might not be ready to build standalone binaries for Windows yet (see the Windows section here), so this probably can't be done yet.
For POSIX systems it should mostly Just Work, unless you're trying to do GUI stuff.
Yes, you won't need to change the source, unless you are using specific libraries that are not cross-platform.
But as @dbaupp said, native executables are different on each platform: *nix uses ELF, Windows uses PE, and OS X uses Mach-O. So you will need to compile it for each platform.
I don't know the state of cross-compiling in Rust, but if it is already implemented, then you should be able to build all the binaries on the same platform; if not, you will have to build each binary on its own platform.

How do you use libtool to create .a files (static libraries) on Mac OS?

When it comes to using the terminal to build libraries manually and such, I unfortunately do not have much experience, and I'm a bit stuck here.
I've downloaded a library for Objective-C which came with makefiles and such.
I can see that the folder also contains an executable file called "libtool". I did some searching, and I suppose this is the program I have to use to build the necessary .a files? Unfortunately I couldn't really find any useful article on this that seemed to work.
The folder for the library contains some .sh files, .pc files and also some .la files, but I'm a bit unsure of which ones I have to use as input to the libtool program to compile them into a .a file.
So my question is what files do you have to input into libtool to compile them into the necessary .a file? And what commands do you use exactly to accomplish this?
Thank you all for your time :)
First a little introduction to static libraries:
Static libraries in Unix environments (like Mac OS X, and Linux too) are actually just archives of object files created by the ar command-line program.
That is what the .a extension stands for: Archive.
To create a static library with some object files you can use the command like this:
ar crv libmy_library.a objectfile1.o objectfile2.o
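If you create an archive by hand like this, it can also help to run ranlib on it to build the symbol index, and you can inspect its contents with ar t; both tools ship with the standard toolchain:
ranlib libmy_library.a
ar t libmy_library.a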
As for your actual question, libtool should be called automatically from the makefile, creating the library, which is the file ending in .la. However, this is not the real library; the real library is in a hidden directory (typically .libs). You can find it by doing e.g.
find . -name '*.a'
But like I said, the makefile should already take care of everything, including installing the correct library in the correct place when you do e.g. make install.
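In practice this means the usual autotools sequence is all you need. If the package's configure script supports it, you can also ask explicitly for static archives (--enable-static and --disable-shared are the standard switches for libtool-based packages, though not every project exposes them):
./configure --enable-static --disable-shared
make
sudo make install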
For information about libtool, see this site.

Eclipse managed make with static and dynamic linked libraries at the same time

I am using the managed make functionality of Eclipse CDT. Creating the project using only dynamic libraries works as expected. But boost_unit_test_framework should be linked statically, because it contains the main function. On the command line it is not a problem to mix dynamic and static libraries. So this is a working example:
g++ -L../Debug -L../boost/lib -o "Test" ./Test.o -ldynLib -Wl,-Bstatic -lboost_unit_test_framework -Wl,-Bdynamic
The dynLib and the standard libraries like libc are linked dynamically, and boost_unit_test_framework is linked statically. BUT how can I enter this information in the project settings? I cannot see any way.
It would be nice to be able to flag this library for static linking in one global place, for every project. There is a convention used by QNX ([manual]): LIBPREF_library and LIBPOST_library can be used to add options before or after the specified library.
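Under that QNX convention, the mixed link from the question would presumably look something like this in the makefile (a sketch based on the macro names above, not something Eclipse CDT itself understands):
LIBPREF_boost_unit_test_framework = -Wl,-Bstatic
LIBPOST_boost_unit_test_framework = -Wl,-Bdynamic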
Update:
I still have no clue how to solve the described problem. But in the meantime I have switched my build system from Managed Make to CMake, and additionally I am now using Qt Creator because it is able to index Boost and does not freeze the UI while updating some internal structures...
[manual] http://www.qnx.com/developers/docs/6.3.0SP3/neutrino/prog/make_convent.html#USEMAC
I don't think you need to specify the type of linking. Dynamic libraries can't be linked statically, and vice versa. On one of my projects, under Project Properties -> C/C++ Build -> Settings, I have both static and dynamic libraries listed under Libraries. It seems to work out what type they are and link fine either way.
Dynamic libraries go in: Linker/Libraries/Libraries (-l)
Static libraries go in: Linker/Miscellaneous/Other files and objects
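For the link line from the question, that amounts to giving the static library as a plain file path instead of a -l switch, so the managed build produces roughly the following (the Boost path is the one from the question and may differ on your system):
g++ -L../Debug -o "Test" ./Test.o ../boost/lib/libboost_unit_test_framework.a -ldynLib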

Build System and portability

I'm wondering how I can make a portable build system (step by step). I currently use CMake because it was easy to set up in the first place, with only one arch target, but now that I have to package the library I'm developing, I'm wondering about the best way to make it portable across the arches I'm testing.
I know I need a config.h to define things depending on the arch, but I don't know how automatic this can be.
Any other ways to set up a build system are warmly welcome!
You can just use CMake, it's pretty straightforward.
You need these things:
First, a means to find out the configuration specifics. For example, if you know that some function is named differently on some platform, you can use TRY_COMPILE to discover that:
TRY_COMPILE(HAVE_ALTERNATIVE_FUNC
${CMAKE_BINARY_DIR}
${CMAKE_SOURCE_DIR}/alternative_function_test.cpp
CMAKE_FLAGS -DINCLUDE_DIRECTORIES=xxx
)
where alternative_function_test.cpp is a file in your source directory that compiles only with the alternative definition.
This will define variable HAVE_ALTERNATIVE_FUNC if the compile succeeds.
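The probe file itself can be very small. For instance, the HAVE_TR1_RANDOM flag used below could come from a TRY_COMPILE over a file like this (just a sketch of the idea):
// tr1_random_test.cpp -- compiles only where the TR1 random header is available
#include <tr1/random>
int main()
{
    std::tr1::mt19937 generator;  // fails to compile if std::tr1 random is missing
    return 0;
}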
Second, you need to make this definition affect your sources. Either you can add it to compile flags
IF(HAVE_TR1_RANDOM)
ADD_DEFINITIONS(-DHAVE_TR1_RANDOM)
ENDIF(HAVE_TR1_RANDOM)
or you can make a config.h file. Create config.h.in with the following line
#cmakedefine HAVE_ALTERNATIVE_FUNC
and generate config.h from it with this line in CMakeLists.txt (see CONFIGURE_FILE):
CONFIGURE_FILE(config.h.in config.h @ONLY)
the #cmakedefine will be translated to #define or #undef depending on the CMake variable.
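The generated header lands in the build directory, so a sketch of how the sources consume it (assuming you add that directory to the include path) is:
INCLUDE_DIRECTORIES(${CMAKE_BINARY_DIR})
and then in the source files:
#include "config.h"
#ifdef HAVE_ALTERNATIVE_FUNC
/* call the alternative name */
#else
/* call the default name */
#endif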
BTW, for testing endianness, see this mail
I have been using the GNU autoconf/automake toolchain, which has worked well for me so far. I am only really focused on Linux/x86 (and 64-bit) and the Mac; the latter matters if you are building on a PowerPC, due to endianness issues.
With autoconf you can check the host platform with the macro:
AC_CANONICAL_HOST
And check the endianness using:
AC_C_BIGENDIAN
Autoconf will then add definitions to config.h which you can use in your code.
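A minimal configure.ac sketch using those macros might look like the following (project name and version are placeholders); AC_C_BIGENDIAN ends up defining WORDS_BIGENDIAN in config.h, which you can then test in your sources:
AC_INIT([mylibrary], [1.0])
AC_CANONICAL_HOST
AC_PROG_CC
AC_C_BIGENDIAN
AC_CONFIG_HEADERS([config.h])
AC_OUTPUT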
I am not certain (I have never tried) how well the GNU autotools work on Windows, so if Windows is one of your targets then you may be better off finding similar functionality in your existing CMake build system.
For a good primer on the autotools, have a look here:
http://www.freesoftwaremagazine.com/books/autotools_a_guide_to_autoconf_automake_libtool