Link static library to shared library or to a binary - meson-build

I have a static library from project A (let's call it liba.a) and I want to compile a shared library in my project B (let's call it libb.so) and embed liba.a in it.
Project B also contains a binary that depends on liba.a, so I want to embed the static library in that binary as well.
Is that possible? How?

When A is a Separate Code Base
What you do is build and install project A. Then create a dependency on project A in project B's definition.
That looks like this:
a_dep = dependency('a', version : '>=1.2.8')
lib_b = shared_library('proj_b', sources: 'prog_b.c', dependencies : a_dep)
The version argument to dependency() is optional.
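For dependency('a') to be found, project A has to be installed somewhere Meson can discover it, typically via pkg-config. A minimal sketch of what project A's own meson.build might export (the file and target names here are assumptions, not taken from the question):
project('a', 'c', version : '1.2.8')
lib_a = static_library('a', 'proj_a.c', install : true)
pkg = import('pkgconfig')
pkg.generate(lib_a, description : 'Project A static library')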
When A is in the Same Meson Project as B
When A and B are in the same Meson project, it's a little uglier: you have to declare a dependency object for A with declare_dependency().
That looks like this:
incdirs = include_directories('include')
lib_a = static_library('a', 'proj_a.c', include_directories : incdirs)
liba_dependency = declare_dependency(
  include_directories : incdirs,
  link_with : lib_a)
Then project B becomes:
lib_b = shared_library('proj_b', sources : 'prog_b.c', dependencies : liba_dependency)
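The binary in project B links against the static library the same way; a sketch, assuming its main source file is called prog_b_main.c:
exe_b = executable('prog_b', 'prog_b_main.c', dependencies : liba_dependency)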

If you have an existing, precompiled library, then you can directly wrap it in a dependency:
cpp = meson.get_compiler('cpp')
# (Meson requires an absolute path for find_library().)
libdir = meson.current_source_dir() + '/lib'
precompiledA_dep = cpp.find_library('A', dirs : libdir) # ./lib/libA.lib
...
# Link against libA.lib here ...
B_lib = library('libB', 'libB.cpp', dependencies : precompiledA_dep)
B_exe = executable('exeB', 'source.cpp', dependencies : precompiledA_dep)
(tested with Meson 0.57)

Related

How to combine the STM32 HAL library with CMake, as it needs a file from the project it is linked to

I have recently switched my STM32 project to CMake to be independent of the IDE. The root repository (the application) contains multiple submodules (HAL, FreeRTOS, etc.) and its CMakeLists.txt explicitly includes every single used file:
set(EXECUTABLE ${PROJECT_NAME}.elf)

add_executable(${EXECUTABLE}
    # Own sources
    src/main.c
    src/SEGGER_SYSVIEW_Config_FreeRTOS.c
    src/startup_stm32h723zgtx.s
    src/stm32h7xx_hal_timebase_tim.c
    src/system_stm32h7xx.c
    # Base CMSIS and HAL library
    lib-hal/stm32h7xx/STM32H7xx_HAL_Driver/Src/stm32h7xx_hal_tim.c
    lib-hal/stm32h7xx/STM32H7xx_HAL_Driver/Src/stm32h7xx_hal_tim_ex.c
    lib-hal/stm32h7xx/STM32H7xx_HAL_Driver/Src/stm32h7xx_hal_uart.c
    lib-hal/stm32h7xx/STM32H7xx_HAL_Driver/Src/stm32h7xx_hal_rcc.c
    lib-hal/stm32h7xx/STM32H7xx_HAL_Driver/Src/stm32h7xx_hal_rcc_ex.c
    # long list of HAL c files there...
    # FreeRTOS library
    lib-freertos/croutine.c
    lib-freertos/event_groups.c
    lib-freertos/list.c
    lib-freertos/queue.c
    lib-freertos/stream_buffer.c
    lib-freertos/tasks.c
    lib-freertos/timers.c
    lib-freertos/portable/GCC/ARM_CM7/r0p1/port.c
    lib-freertos/trace/Sample/FreeRTOSV10/SEGGER_SYSVIEW_FreeRTOS.c
    lib-freertos/trace/SEGGER/Syscalls/SEGGER_RTT_Syscalls_GCC.c
    lib-freertos/trace/SEGGER/SEGGER_RTT_ASM_ARMv7M.S
    lib-freertos/trace/SEGGER/SEGGER_RTT_printf.c
    lib-freertos/trace/SEGGER/SEGGER_RTT.c
    lib-freertos/trace/SEGGER/SEGGER_SYSVIEW.c
)

target_include_directories(${EXECUTABLE}
    PRIVATE
        include
        src
        lib-hal/stm32h7xx/CMSIS/Include
        lib-hal/stm32h7xx/CMSIS/Device/ST/STM32H7xx/Include
        lib-hal/stm32h7xx/STM32H7xx_HAL_Driver/Inc
        lib-freertos/include
        lib-freertos/trace/Config
        lib-freertos/trace/SEGGER
        lib-freertos/trace/Sample/FreeRTOSV10/
        lib-freertos/portable/GCC/ARM_CM7/r0p1
)
This solution works, but I know it is not a sustainable approach. So I tried to create a library in each of the lib-hal and lib-freertos submodules, specifying their sources and includes:
add_library(lib-hal-stm32h7xx)

target_include_directories(lib-hal-stm32h7xx
    PUBLIC
        CMSIS/Include
        CMSIS/Device/ST/STM32H7xx/Include
        STM32H7xx_HAL_Driver/Inc
    PRIVATE
        STM32H7xx_HAL_Driver/Src
)

target_sources(lib-hal-stm32h7xx
    PRIVATE
        STM32H7xx_HAL_Driver/Src/stm32h7xx_hal_tim.c
        STM32H7xx_HAL_Driver/Src/stm32h7xx_hal_tim_ex.c
        STM32H7xx_HAL_Driver/Src/stm32h7xx_hal_uart.c
        STM32H7xx_HAL_Driver/Src/stm32h7xx_hal_rcc.c
        STM32H7xx_HAL_Driver/Src/stm32h7xx_hal_rcc_ex.c
        # long list of HAL c files there...
)
and then using
add_subdirectory(lib-hal/stm32h7xx)
add_subdirectory(lib-freertos)
and
target_link_libraries(${EXECUTABLE} lib-freertos lib-hal-stm32h7xx)
to "import" submodules into application project. But when building the executable, gcc cannot access files stm32h7xx_hal_conf.h and FreeRTOSConfig.h which are located in root directory include. I do not want to put configuration headers into submodules because they are used in multiple projects with different configurations. Is it possible to somehow extend already specified directory search scope for library after adding it into parent project?
File structure of project:
- src
- include (configuration for lib-hal and lib-freertos included there)
- lib-hal
  - includes...
  - sources...
- lib-freertos
  - includes...
  - sources...
Thanks in advance for any response.
As Tsyvarev mentioned in the comments, you can modify the properties of the target in your project. To keep things clean, I usually create a function for this and place it in a separate file.
Tip: you can also add source files to the target this way. In the case of FreeRTOS, you could add the architecture-specific files, in case not all your projects run on the same MCU family.
function(configure_freertos target_name)
    # Architecture-specific FreeRTOS port for this project's MCU
    target_sources(${target_name}
        PRIVATE
            lib-freertos/portable/GCC/ARM_CM7/r0p1/port.c
    )
    # Expose the project's own include directory (FreeRTOSConfig.h)
    # and the matching port headers to the library target
    target_include_directories(${target_name}
        PUBLIC
            include
            lib-freertos/portable/GCC/ARM_CM7/r0p1
    )
endfunction()
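For completeness, a sketch of how the root CMakeLists.txt could tie this together; the target and directory names are the ones used above, the helper-file path is hypothetical, and none of this is tested against the actual project:
# Hypothetical location of the helper function shown above
include(cmake/configure_freertos.cmake)

# Pull in the submodule libraries
add_subdirectory(lib-hal/stm32h7xx)
add_subdirectory(lib-freertos)

# Make the project-specific stm32h7xx_hal_conf.h visible to the HAL target
# (PUBLIC so that anything using the HAL headers sees it too)
target_include_directories(lib-hal-stm32h7xx PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include)

# Add the project-specific FreeRTOS port and configuration include
configure_freertos(lib-freertos)

# Link both libraries into the application
target_link_libraries(${EXECUTABLE} PRIVATE lib-freertos lib-hal-stm32h7xx)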

Rust workspace: Is it possible to use a binary crate for integration tests in a lib crate?

I have the following workspace structure:
[workspace]
members = [
    "skserver",          # binary
    "skclient",          # binary
    "skcommon",          # lib
    "skintegrationtests" # lib
]
The intention was to have an extra lib crate for integration testing of client/server-functionality. The Cargo.toml of skintegrationtests is as follows:
[dependencies]
# for integration tests of own programs etc.
skcommon = { path = "../skcommon" }
skclient = { path = "../skclient" }
skserver = { path = "../skserver" }
skcommon can be referenced, but skclient cannot (I haven't tried skserver). Is that intentional on Rust's part? And if so, why?
I started out putting the integration tests in skcommon, but I want to avoid circular dependencies with skclient and skserver, so I created skintegrationtests.
If you want to run the skclient binary from skintegrationtests, then you're looking for RFC 3028 binary dependencies, which are not yet implemented. There isn't a clean way to do this yet other than a build script separate from Cargo that makes sure the binary is built and then runs the test.
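One pragmatic workaround is to shell out from a test, which keeps everything inside the workspace but relies on cargo being on the PATH and on the default target-directory layout; a rough sketch (file name and assertions are made up for illustration):
// skintegrationtests/tests/client_binary.rs
use std::process::Command;

#[test]
fn skclient_binary_runs() {
    // Make sure the binary is built before we try to run it.
    let build = Command::new("cargo")
        .args(["build", "-p", "skclient"])
        .status()
        .expect("failed to spawn cargo build");
    assert!(build.success());

    // Integration tests run with the package directory as the working
    // directory, so the workspace target dir is one level up.
    let run = Command::new("../target/debug/skclient")
        .status()
        .expect("failed to run the skclient binary");
    assert!(run.success());
}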
If you want to call functions defined in the skclient package's code, then you need to modify skclient so it is also a library package — i.e. it has a lib.rs — and all of the functions you want to call are defined there rather than in main.rs. This does not prevent it from also having a binary, which can refer to the library as use skclient::whatever;.
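A sketch of that split (the connect function is made up purely for illustration):
// skclient/src/lib.rs -- the library part, holding the testable API
pub fn connect(addr: &str) -> bool {
    // real connection logic would live here
    !addr.is_empty()
}

// skclient/src/main.rs -- the binary stays a thin wrapper over the library
fn main() {
    if skclient::connect("127.0.0.1:4000") {
        println!("connected");
    }
}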

meson: How to get a target's name in meson

I add a shared_library target in meson.build
libmali = shared_library(
  'mali',
  dummy_source,
  install : true,
  version : meson.project_version()
)
I want to get libmali's name, "mali", from code elsewhere in this meson.build.
How can I get it? Is there an API like libmali.getname()?
Yes, but only since Meson 0.45; shared_library() returns a build target object, which has a method name() since the aforementioned version.
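With the libmali target defined as above, a quick sketch of using it (message() is only there to show where the name ends up):
# elsewhere in the same meson.build, after the shared_library() call above
message('shared library target name: ' + libmali.name())  # prints "mali"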

How to setup a CMAKE imported target for plugin components?

I have a precompiled library that also employs dynamically loaded plugins:
Library L (composed of a library.lib and a library.dll)
Plugin P (composed only of a plugin.dll)
I am defining the imported target of L as:
add_library(L SHARED IMPORTED)
set_target_properties(L PROPERTIES
    IMPORTED_LOCATION_RELEASE library.dll
    IMPORTED_IMPLIB_RELEASE library.lib
)
set_target_properties(L PROPERTIES
    INTERFACE_LINK_LIBRARIES P
)
How do I define the imported target for P and its properties?
If I define it as:
add_library(P MODULE IMPORTED)
set_target_properties(P PROPERTIES
    IMPORTED_LOCATION_RELEASE plugin.dll
)
Then the projects generated against L will erroneously treat plugin.dll as a library to be linked.
I would like instead to keep the dependency (so that I can transitively install plugin.dll) but avoid having L link against target P.
I have ended up solving this by not linking L to P using INTERFACE_LINK_LIBRARIES.
I am configuring L by adding an additional variable containing its plugins:
LIST(APPEND L_PLUGINS P)
Targets using L can then access its plugins simply by reading the variable ${L_PLUGINS} (e.g. in order to install their files).
NB: this is the same approach used for Qt plugin components.
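For illustration, consuming that list at install time could look roughly like this (a sketch, assuming each entry in L_PLUGINS is an imported target as above):
foreach(plugin IN LISTS L_PLUGINS)
    # $<TARGET_FILE:...> resolves to the imported plugin's .dll location
    install(FILES $<TARGET_FILE:${plugin}> DESTINATION bin)
endforeach()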

How can I create a simple scala.js cross project with client, server and api modules?

I would like to create a very simple Scala.js application with three modules under a project root like this:
project
-server
-client
-api
This is a cross project because I would like the source in the api module to be compiled by both Scala and Scala.js. I understand that this might necessitate having two api modules (which share the same source code), a jvm one and a js one.
Source in the server module should only be compiled by Scala and source in the client module should only be compiled by Scala.js. The server module needs to depend on the api module (jvm type) whereas the client module needs to depend on the api module (js type).
Could someone please post the most basic build.sbt that shows how this can be achieved?
EDIT:
It looks like this is the way to do it:
lazy val commonSettings = Seq(
  scalaVersion := "2.12.1"
)

lazy val projectRoot = project.in(file(".")).
  aggregate(client, server).
  settings(
    name := "projectRoot"
  )

lazy val server = project.in(file("server")).
  settings(
    commonSettings
  ).
  dependsOn(apiJvm)

lazy val client = project.in(file("client")).
  settings(
    commonSettings
  ).
  enablePlugins(ScalaJSPlugin).
  dependsOn(apiJs)

lazy val api = crossProject.in(file(".")).
  settings(
    commonSettings
  ).
  jvmSettings(
    // Add JVM-specific settings here
  ).
  jsSettings(
    // Add JS-specific settings here
  )

lazy val apiJvm = api.jvm.in(file("apiJVM"))
lazy val apiJs = api.js.in(file("apiJS"))
The only problem with this is that the shared source goes into a folder called shared, with a different module name in IntelliJ. It's a shame it isn't in an api folder with a module name of api. Presumably I shouldn't put any code in the apiJS and apiJVM modules that get created? They are only there to be used as dependencies?
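One variation worth noting (an untested sketch using the same plugin setup as above): pointing the cross project at an api directory instead of the build root keeps the shared sources under api/shared and the platform projects under api/jvm and api/js, which is closer to the layout described in the question:
lazy val api = crossProject.in(file("api")).
  settings(commonSettings)

// The jvm/js projects are only dependency anchors; shared code lives in api/shared
lazy val apiJvm = api.jvm
lazy val apiJs  = api.js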