How does one package inter-dependent libraries with Conan?

I have a project for which I am trying to use conan as the package manager.
The project uses a large number of libraries, most of which are self-sufficient. However, some of them depend on another library.
To give a concrete example, let's say we have a project P that requires libraries A and B. A is self contained, but B depends on A for its compilation and linking.
I can easily create the conanfile.py for library A, and a conanfile.txt for project P. Assuming A and B have not been built yet, I want to be able to type in P's build directory:
conan install ../ --build=missing
and have conan download, compile, and install library A, THEN download, compile, and install library B, with B having the correct references to A.
What is the proper approach in writing the conanfile.py of B?

When you write the package recipe for package B, you specify that it depends on A:
class PackageB(ConanFile):
    requires = "A/1.0@user/stable"
When you specify the dependencies in your project (either with a conanfile.txt or a conanfile.py), you declare them as usual. Conan handles transitive dependencies, so it knows that it has to build package A first (or retrieve its binary package, if desired), and then package B.
It is typical that the build script for package B has to take the dependency on A into account. If using CMake, the solution would be to use the cmake generator and consume the generated conanbuildinfo.cmake, which has the include directories, library names, etc. for package A.
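A minimal sketch of B's recipe along those lines, assuming Conan 1.x and a CMake-based build; the name/version attributes are assumptions, and the try/except import fallback is only there so the sketch can be read without Conan installed:

```python
# Sketch of package B's conanfile.py (Conan 1.x API); the reference
# "A/1.0@user/stable" follows the example above.
try:
    from conans import ConanFile, CMake
except ImportError:  # fallback so the sketch can be inspected without Conan
    ConanFile, CMake = object, None

class PackageB(ConanFile):
    name = "B"
    version = "1.0"
    settings = "os", "compiler", "build_type", "arch"
    generators = "cmake"            # writes conanbuildinfo.cmake for the build
    requires = "A/1.0@user/stable"  # Conan resolves and builds A before B

    def build(self):
        cmake = CMake(self)   # translates Conan settings into CMake flags
        cmake.configure()     # B's CMakeLists.txt includes conanbuildinfo.cmake
        cmake.build()
```

With something like this in place, running conan install ../ --build=missing from P's build directory builds A first and then B, because B's requires makes the dependency explicit.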
The docs specify a bit more about the syntax:
http://docs.conan.io/en/latest/reference/conanfile.html#requirements
You can check some of the existing packages which are already managing transitive dependencies:
ZMQ c++ wrapper, which depends on zmq, and is a very simple example as the wrapper is header only: https://www.conan.io/source/zmqcpp/4.1.1/memsharded/stable
Boost (depends on Zlib, BZip, but conditionally, check inside the config() method of the conanfile): https://www.conan.io/source/Boost/1.60.0/lasote/stable
Poco, also depends conditionally on OpenSSL, MySQLClient: https://www.conan.io/source/Poco/1.7.3/lasote/stable

Related

Maintaining multiple projects which consume conan packages

Background:
I have Visual Studio solution(s) with multiple (50+) projects (static/dynamic libraries and final executables). Visual Studio's internal reference mechanism is used to consume the required libraries for particular executables. Of course each project uses external packages: there are "duplicates" like boost and gtest, and there are also some "unique" references used by only one or a few projects.
What's more, libraries are used in other solutions (project sharing) to deploy other executables.
This is my general project structure:
MainDir
|
- DebugDlls (build output)
- Debug64Dlls (build output)
- ReleaseDlls (build output)
- Release64Dlls (build output)
- Libraries
|
- lib1
- lib2
- ...
- Executables
|
- exe1
- exe2
...
I'm about to migrate from NuGet to conan as the dependency manager for external libraries, since there are more ready-to-use conan packages than NuGet ones and it's cross-platform. I'd like to do it project by project, dependency by dependency.
One global conan file to rule them all is not an option, since each library has to be as standalone as possible so I'm able to simply grab one and use it for a new executable. What's more, it would be impossible to track the dependencies of a particular library or executable.
My idea is to put a separate conanfile in each project and define dependencies.
Here is the first issue: I need some global/automatic management of common libraries like boost, so as not to mix versions/variants and to spare some time on version updates.
- this one may be handled by a global file which defines reusable dependencies
- is there something ready to use in conan, like a template?
The second issue is to copy DLLs from the dependencies into the proper build output so I'm able to execute the binaries.
- this one should also be fixable by some global file with proper defines.
The third one is to execute conan install in each project.
- once again, a hand-crafted script will do the job.
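For the DLL-copying issue, Conan 1.x supports an [imports] section in a per-project conanfile.txt; a hedged sketch, where the boost reference and the relative path to Debug64Dlls are placeholders to be adapted to the layout above:

```ini
[requires]
boost/1.69.0@conan/stable

[generators]
visual_studio

[imports]
bin, *.dll -> ../../Debug64Dlls
```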
I was digging through the conan documentation, but it's not very well organized and I was unable to find a proper solution for my case. Maybe I missed something?
What would be the best approach here? Is there any built-in conan mechanism for that (like CMake's add_subdirectory)? I would not like to reinvent the wheel if one already exists :)
I'm about to use conan 1.x
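The per-project conan install step mentioned above can indeed be hand-scripted; a minimal Python sketch, where the directory names and the settings values are assumptions taken from the question's layout:

```python
# Walk the tree and run "conan install" in every project that has a conanfile.
import os
import subprocess

def find_conan_projects(root):
    """Yield every directory under root containing a conanfile.txt/.py."""
    for dirpath, _dirnames, filenames in os.walk(root):
        if "conanfile.txt" in filenames or "conanfile.py" in filenames:
            yield dirpath

def install_command(project_dir, build_type="Release", arch="x86_64"):
    """Build the conan install command line for one project."""
    return ["conan", "install", project_dir,
            "-s", "build_type=" + build_type,
            "-s", "arch=" + arch,
            "--build=missing"]

if __name__ == "__main__":
    for root in ("Libraries", "Executables"):   # layout from the question
        for project in find_conan_projects(root):
            subprocess.check_call(install_command(project))
```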

Should I supply external libraries with a CMakeLists.txt or use find_package instead?

I am working on a project that needs some external libraries. Since it is meant to be cross platform, I am using cmake.
What is the preferred way when distributing such projects? Should I supply the external libraries (such as zlib) with their own CMakeLists.txt, or should I signal the dependency by simply supplying find_package() calls?
The former provides everything needed, while the latter lets the developer decide how to supply the dependency (vcpkg, for example).
Although there is no universally preferred approach, I absolutely believe you should stick to find_package. Declare your dependencies like this:
find_package(Pkg [version] REQUIRED [components])
Include [version] and [components] only if you know Pkg itself provides first-party CMake package configuration files. If you are writing and distributing a library, you will include equivalent find_dependency calls in your MyProjConfig.cmake file.
If some dependency neither has a standard CMake find module nor provides its own CMake package configuration file, you should write your own find module in ./cmake and add list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/cmake") to the root CMakeLists.txt, before any find_package call. Install your find modules, too, and include the same addition to the module path in your config files.
Inside the find module, you can use whatever approach you want to create some imported targets for your dependencies. Using PkgConfig is a good approach here.
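A skeleton find module along those lines, with "Foo" as a placeholder name (saved as cmake/FindFoo.cmake):

```cmake
# Hypothetical FindFoo.cmake: locate the library, then wrap it in an
# imported target so consumers just link against Foo::Foo.
find_path(Foo_INCLUDE_DIR foo.h)
find_library(Foo_LIBRARY foo)

include(FindPackageHandleStandardArgs)
find_package_handle_standard_args(Foo
    REQUIRED_VARS Foo_LIBRARY Foo_INCLUDE_DIR)

if(Foo_FOUND AND NOT TARGET Foo::Foo)
    add_library(Foo::Foo UNKNOWN IMPORTED)
    set_target_properties(Foo::Foo PROPERTIES
        IMPORTED_LOCATION "${Foo_LIBRARY}"
        INTERFACE_INCLUDE_DIRECTORIES "${Foo_INCLUDE_DIR}")
endif()
```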
Going through find_package instantly works with a number of dependency providers: vcpkg, the cmake_paths Conan generator, Linux distro system packages, and so on.
The primary alternative to doing this is to vendor the code, meaning including your dependencies in your build directly, whether through copy/paste into your source tree, a git submodule, or by build-time download from the internet (FetchContent).
The mechanism used to build these is nearly always add_subdirectory in the end, which pulls your dependencies' CMake builds into yours.
Perhaps the biggest issue with this is that most projects' CMake code is totally unprepared to be used in this way. It might trample your cache variables, inject invalid flags into your targets, overwrite your generated headers, and so on. Integration is a nightmare.
Also, from a software distribution standpoint, doing this ties your code to particular versions of your dependencies and takes control away from others who might want to package your code. For instance, Debian packages are not allowed to bundle their dependencies: if libA depends on libB, then each gets its own package. With find_package, it is trivial for a maintainer to inject the appropriate dependencies into your build. Without it, that typically involves a difficult-to-maintain patch.

conan.io package management - source only package

This is not a question about header-only packages; those are straightforward.
I've got a cross-platform library which I'd like to not package with any .a (or similar) prebuilt binaries but rather indicate its .cpp must be built along with the consuming application (add_subdirectory style).
The only ways I see to do this are:
- conan install --build style
- setting up conan build profiles
Both of these are reasonable, yet seem to be more effort than needed for a consumer who "just wants to recompile the C++" with whatever toolchain their top-level CMake is pointed towards.
So, in other words, can conan serve similarly as a delivery mechanism for a git retrieve/add_subdirectory and present it as a CONAN_PKG?
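One hedged sketch of that idea in Conan 1.x: a recipe that packages only the sources and leaves compilation entirely to the consumer. The names are placeholders, and the import fallback only makes the sketch readable without Conan installed:

```python
# Recipe that ships sources rather than prebuilt binaries (Conan 1.x sketch).
try:
    from conans import ConanFile
except ImportError:  # fallback so the sketch can be inspected without Conan
    ConanFile = object

class MyLibSources(ConanFile):
    name = "mylib-src"          # placeholder name
    version = "0.1"
    exports_sources = "CMakeLists.txt", "src/*", "include/*"

    def package(self):
        # Copy the sources themselves into the package; no compilation here.
        self.copy("*")

    def package_id(self):
        # One package id for every platform: there is nothing prebuilt.
        self.info.header_only()
```

The consumer could then add_subdirectory the package folder from its own CMakeLists, so the .cpp files are compiled with the consumer's toolchain.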

CMake: how to install only shared libraries found via find_library?

I've got several third-party libs, some them shared, some static, I need to install the shared ones.
Currently I'm doing find_library calls, collecting a list of all the needed libs, and passing it to install(FILES ...).
But this way both .a and .so libs are installed.
With install(TARGETS ...) there is a separation on RUNTIME, ARCHIVE etc.
But I do not want to create a dummy target for each of the libs.
I also do not want to separate libs into shared and static (there is another separation already).
Is there a nicer way to filter for shared libs only than just regexing the filename? Maybe libraries are treated as something 'more' than just file paths after find_library, so I can somehow get the library type from them?
Let's see what is going on.
You install some libraries via your package manager (yum/dnf/apt-get/something else).
You build your app.
You distribute your app.
If so, you should not ask cmake to install those libs, because if someone else would like to install the same third-party libs via another rpm package, it would create a conflict (and one package would have to be removed) - it's a mess.
The place which manages library dependencies is the package manager - create an rpm package which would have:
Requires: all_your_dynamic_libs
BuildRequires: all_your_static_libs
If point 1 is rather "You install some libraries by make && make install", then you should first create an rpm package for that.
It is kind of a pain to create all this additional work, but trust me, you DO NOT want to create all-in-one package.

In CMake, is it possible to build a dependency imported from a build tree?

I am trying to use the CMake feature for exporting/importing targets from a build tree (see this wiki page). I have this dependency library:
add_library(dependency SHARED dependency.cpp)
export(TARGETS dependency FILE dependency-targets.cmake)
And an executable uses this library in another project:
include(${DEPENDENCY_PATH}/dependency-targets.cmake)
add_executable(main-app main.cpp)
target_link_libraries(main-app dependency)
This works fine. While I do understand that this export/import mechanism "only" provides a convenient way to reference external binaries, I am wondering whether dependency could be compiled when running make in main-app - either using the import mechanism (which I doubt) or another one?
You could look into the "superbuild" pattern and ExternalProject.
The gist of the idea is that you set up one "superbuild" project which uses just ExternalProject_Add() commands; this sets up your real project and all its dependencies.
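A minimal superbuild sketch along those lines; the dependency/ and main-app/ directory names are taken from the question, the rest is an assumption:

```cmake
# Superbuild: build the dependency, install it to a local prefix, then
# build main-app against that prefix. Configure/build this project
# instead of main-app's, and "make" rebuilds the dependency first.
cmake_minimum_required(VERSION 3.10)
project(superbuild NONE)

include(ExternalProject)

ExternalProject_Add(dependency
    SOURCE_DIR  ${CMAKE_CURRENT_SOURCE_DIR}/dependency
    INSTALL_DIR ${CMAKE_BINARY_DIR}/stage
    CMAKE_ARGS  -DCMAKE_INSTALL_PREFIX=<INSTALL_DIR>)

ExternalProject_Add(main-app
    SOURCE_DIR ${CMAKE_CURRENT_SOURCE_DIR}/main-app
    CMAKE_ARGS -DCMAKE_PREFIX_PATH=${CMAKE_BINARY_DIR}/stage
    INSTALL_COMMAND ""          # main-app is only built, not installed
    DEPENDS dependency)         # guarantees dependency is built first
```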