Simple way to add a Premake subproject inside a CMake project? - cmake

I use CMake for my C++ projects and I like to have my dependencies as Git submodules inside a third-party folder so I can have easy access to the code and the possible CMake targets I can link with.
.
├── CMakeLists.txt
├── src
│   └── main.cpp
└── third-party
    └── CMakeSubmodule      # OK
        └── CMakeLists.txt  # OK
My root CMakeLists.txt would typically contain:
add_subdirectory(src)
add_subdirectory(third-party/CMakeSubmodule)
target_link_libraries(myTarget PRIVATE SubmoduleTarget)
Now, I want to add a submodule that only uses Premake:
.
├── CMakeLists.txt
├── src
│   └── main.cpp
└── third-party
    └── PremakeSubmodule    # !
        └── premake5.lua    # !
How would you simply add this submodule as a dependency?
Do I have to manually convert all Premake files to CMakeLists.txt?
A convenient solution would allow directly linking to a target as follows, so I don't have to manually set the correct include directories and link to the library files:
target_link_libraries(myTarget PRIVATE PremakeTarget) # Would be great
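One possible direction, sketched here with hypothetical paths, output names, and a gmake-based Premake workflow (none of these details are confirmed by the submodule itself), is to drive the Premake build through ExternalProject_Add and declare an imported target by hand:

```cmake
# Sketch only: assumes premake5 is on PATH, that the submodule's gmake action
# produces bin/libPremakeLib.a, and that its public headers live in include/.
include(ExternalProject)

set(PREMAKE_DIR ${CMAKE_SOURCE_DIR}/third-party/PremakeSubmodule)

ExternalProject_Add(PremakeSubmodule_build
    SOURCE_DIR        ${PREMAKE_DIR}
    BUILD_IN_SOURCE   TRUE
    CONFIGURE_COMMAND premake5 gmake2
    BUILD_COMMAND     make
    INSTALL_COMMAND   ""
)

# Hand-written imported target so consumers get includes and the lib automatically.
add_library(PremakeTarget STATIC IMPORTED)
set_target_properties(PremakeTarget PROPERTIES
    IMPORTED_LOCATION             ${PREMAKE_DIR}/bin/libPremakeLib.a
    INTERFACE_INCLUDE_DIRECTORIES ${PREMAKE_DIR}/include
)
add_dependencies(PremakeTarget PremakeSubmodule_build)  # needs CMake >= 3.3
```

With that in place, target_link_libraries(myTarget PRIVATE PremakeTarget) works as in the snippet above; the price is keeping the imported location and include path in sync with premake5.lua by hand.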

Related

Build and run an executable locally with Conan

Say I have a simple hello world project in CMake that creates a binary under bin.
I build it with Conan using the latest CMake module.
In the conanfile.py, is this package method enough to announce the executable?
def package(self):
    cmake = CMake(self)
    cmake.install()
Then, if I want to build the project and run the executable locally, but in Conan's context, which commands should I type?
The package method you've defined will use the installation structure defined by cmake.install() to define the package structure. For example, if your cmake.install() method installs binaries to the bin directory, the bin directory will be present in the package folder in your conan cache, i.e. ~/.conan/data/<package>/<version>/<user>/<channel>/package/<package_id>/bin.
This alone is enough to run the executable locally - you can execute it from the above path, put it onto your PATH - whatever you need. It's not convenient though. To add some convenience, you can use the VirtualRunEnv generator to consume executables. To illustrate with cmake:
$ conan install cmake/3.22.4@ --build=missing -g VirtualRunEnv
This will install cmake 3.22.4 into your local cache, and generate the following files in your cwd:
.
├── conanrunenv-release-x86_64.sh
├── conanrun.sh
├── deactivate_conanrunenv-release-x86_64.sh
└── deactivate_conanrun.sh
You can use this in the same way you would a python virtual environment:
$ source conanrun.sh
$ which cmake
/home/user/.conan/data/cmake/3.22.4/_/_/package/5c09c752508b674ca5cb1f2d327b5a2d582866c8/bin/cmake
And to restore the environment:
$ source deactivate_conanrun.sh
Restoring environment
$ which cmake
/usr/bin/cmake
The second method is to use the deploy generator:
$ conan install cmake/3.22.4@ --build=missing -g deploy
This will grab cmake and all of its dependencies from the conan cache and dump them into your cwd. After running this command, the directory looks like the following:
.
├── cmake
│   ├── bin
│   │   ├── ccmake
│   │   ├── cmake
│   │   ├── cpack
│   │   └── ctest
│   ├── licenses
│   │   └── Copyright.txt
│   └── share
│       ├── aclocal
│       ├── bash-completion
│       ├── cmake-3.22
│       ├── emacs
│       └── vim
├── deploy_manifest.txt
└── openssl
    ├── bin
    │   ├── c_rehash
    │   └── openssl
    ├── include
    │   └── openssl
    ├── lib
    │   ├── cmake
    │   ├── libcrypto.a
    │   └── libssl.a
    └── licenses
        └── LICENSE
You can then move this wherever you need to and put it on your system PATH if you so desire. The same principle would apply to your package - create the recipe in the local cache, and use generators to consume it.
To illustrate:
Place your package in the conan cache
$ conan create .
Consume it using VirtualRunEnv
$ conan install mypkg/0.1.0@user/channel --build=missing -g VirtualRunEnv
Consume it using deploy
$ conan install mypkg/0.1.0@user/channel --build=missing -g deploy
Instead of issuing conan install commands above, you can also list your package as a requirement in a conanfile.txt or conanfile.py for a consumer package, i.e. for a conanfile.py:
def requirements(self):
    self.requires("mypkg/0.1.0@user/channel")
    self.requires("someotherpkg/1.2.0")
And then you can use the virtual environment generators to collect environment information for all dependencies. In the directory containing the conanfile.py:
$ conan install . -g VirtualRunEnv
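For a conanfile.txt consumer, the equivalent (using the same hypothetical package names as above) would be:

```text
[requires]
mypkg/0.1.0@user/channel
someotherpkg/1.2.0

[generators]
VirtualRunEnv
```

Running conan install . in that directory then produces the same conanrun.sh activation scripts shown earlier.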
Hope this helps.
References:
VirtualRunEnv generator
deploy generator
Mastering Conan Virtual Environments
virtualrunenv generator (deprecated, I believe)
Conan generators

How to compile all the libraries as static but one as shared?

My folder structure is like that:
├── SubLibA
│   ├── CMakeLists.txt
│   ├── include
│   │   └── SubLibA.h
│   └── SubLibA.cpp
├── SubLibB
│   ├── CMakeLists.txt
│   ├── include
│   │   └── structs.h
│   └── SubLibB.cpp
└── SharedLib
    ├── CMakeLists.txt
    ├── include
    │   └── SharedLib.h
    ├── SharedLib.cpp
    └── SharedLib.h
My global CMakeLists.txt looks like this:
add_subdirectory(SubLibA)
add_subdirectory(SubLibB)
add_subdirectory(SharedLib)
They all compile as static by default.
SharedLib depends on SubLibB that depends on SubLibA.
The dependent libraries SharedLib and SubLibB have:
# SubLibB
target_link_libraries(${PROJECT_NAME}
    SubLibA::SubLibA
)
# SharedLib
target_link_libraries(${PROJECT_NAME}
    SubLibB::SubLibB
)
Running cmake .. -DBUILD_SHARED_LIBS=ON compiles all three libs as shared libraries...
Since they are tightly coupled, I'd like to keep them in the same repository with a single CMakeLists.txt that compiles them all at once. I want to use the power of Modern CMake, with as few hard-coded paths and custom files as possible, to keep maintenance straightforward.
Try setting the variable within cmake:
set(BUILD_SHARED_LIBS OFF)
add_subdirectory(SubLibA)
add_subdirectory(SubLibB)
set(BUILD_SHARED_LIBS ON)
add_subdirectory(SharedLib)
set(BUILD_SHARED_LIBS OFF)
If you want SubLibA and SubLibB always be static libraries you can use the STATIC keyword on the add_library command, e.g. add_library(SubLibA STATIC ${SOURCES}) By omitting the keyword for SharedLib you are still free to build it as static or shared lib by setting -DBUILD_SHARED_LIBS=ON on the CMake command line.
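Putting both suggestions together, a minimal sketch (reusing the target, alias, and directory names from the question; the source lists are illustrative) might look like:

```cmake
# SubLibA/CMakeLists.txt -- the STATIC keyword overrides BUILD_SHARED_LIBS
add_library(SubLibA STATIC SubLibA.cpp)
add_library(SubLibA::SubLibA ALIAS SubLibA)
target_include_directories(SubLibA PUBLIC include)

# SharedLib/CMakeLists.txt -- no keyword, so -DBUILD_SHARED_LIBS=ON
# makes this library (and only this library) shared
add_library(SharedLib SharedLib.cpp)
target_link_libraries(SharedLib PRIVATE SubLibB::SubLibB)
```

The root CMakeLists.txt then keeps its three plain add_subdirectory() calls, and the BUILD_SHARED_LIBS flag only affects SharedLib.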

Does CMake has QMake's .qmake.conf alternative (automatically included file from parent directory) or other means to achieve similar result?

Let's say we have repository structure like this (note the .qmake.conf files):
repo/
├── libraries
│   └── libFoo
│       └── libFoo.pri
├── projects
│   ├── ProjectX
│   │   ├── apps
│   │   │   └── AppX
│   │   │       └── AppX.pro
│   │   ├── libs
│   │   │   └── libX
│   │   │       └── libX.pri
│   │   └── .qmake.conf
│   └── ProjectY
│       ├── apps
│       │   └── AppY
│       │       └── AppY.pro
│       └── .qmake.conf
├── qmake
│   └── common.pri
└── .qmake.conf
QMake supports .qmake.conf files, where you can declare useful variables, and it is automatically included in your .pro file if found in parent directory.
This is how it helps to avoid dealing with ../../.. relative paths, for example:
The root repo/.qmake.conf declares REPO_ROOT=$$PWD.
The project also has its own repo/projects/ProjectX/.qmake.conf, which contains include(../../.qmake.conf) and declares PROJECT_ROOT=$$PWD.
The project's application .pro file (repo/projects/ProjectX/apps/AppX/AppX.pro) can then avoid writing ../../ and include all dependencies from sibling and parent directories like this:
include($${REPO_ROOT}/qmake/common.pri)
include($${REPO_ROOT}/libraries/libFoo/libFoo.pri)
include($${PROJECT_ROOT}/libs/libX/libX.pri)
This is convenient and tidy. You DO have to write ../../ once (and update it if the repository tree changes), but only once per new .qmake.conf, and afterwards you can use the variables to refer to various useful relative paths in the repository from any number of .pro files.
Is there a similar technique in CMake? How could this kind of variable organization be achieved with CMake in the most convenient way?
In CMake you can achieve similar result somewhat differently:
(regarding "useful variables" management)
CMake knows about 3 "types of variables":
vars with directory scope; directory scope variables behave in such a way that if you define them in some folder, they will automatically be visible in all subfolders. In brief, if you define some var in root CMakeLists.txt, it will be visible in all project subfolders. Example of defining "directory scope variable":
# outside any function
set(MY_USEFUL_VAR SOME_VALUE)
vars with function scope; function scope variables are variables defined within the function. They are visible in the current function scope and all scopes initiated from it. Example of function scope variable:
function(my_function)
    # note that the definition is within the function
    set(MY_LOCAL_VAR SOME_VALUE)
    # rest of the function body...
endfunction()
cache variables may be considered as "global variables", and those are also stored within CMakeCache.txt file within the root build folder. Cache variables are defined as follows (adding a new string variable):
set(MY_CACHE_VAR "this is some string value" CACHE STRING "explanation of MY_CACHE_VAR")
Also, as already suggested in the comments, you can place variable definitions into various "include files" and pull them in with CMake's include statement.
In the end, here is the documentation for the set and include CMake commands.
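As a concrete sketch of the .qmake.conf-style setup translated to CMake (file and variable names are illustrative, not prescribed by CMake):

```cmake
# repo/CMakeLists.txt -- directory-scope variables are inherited by subdirs,
# so defining them once at the root plays the role of repo/.qmake.conf.
set(REPO_ROOT ${CMAKE_CURRENT_SOURCE_DIR})
include(${REPO_ROOT}/cmake/common.cmake)   # analogous to qmake/common.pri

# repo/projects/ProjectX/CMakeLists.txt -- plays the role of the project-level
# .qmake.conf; no include(../../...) needed, the parent scope is inherited.
set(PROJECT_ROOT ${CMAKE_CURRENT_SOURCE_DIR})

# repo/projects/ProjectX/apps/AppX/CMakeLists.txt -- no ../../ anywhere.
# Out-of-tree sources need an explicit binary directory as second argument.
add_subdirectory(${REPO_ROOT}/libraries/libFoo ${CMAKE_CURRENT_BINARY_DIR}/libFoo)
add_subdirectory(${PROJECT_ROOT}/libs/libX ${CMAKE_CURRENT_BINARY_DIR}/libX)
```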

Is there any way to link add_custom_command to a target defined in another directory with add_custom_target?

Because this is the standard structure of a ROS workspace (robotics framework) I have the following two directories:
ros/
├── DIR_A
│   ├── CMakeLists.txt
│   ├── package.xml
│   ├── README.md
│   ├── setup.py
│   └── src
│       └── some_code.cpp
└── DIR_B
    ├── CMakeLists.txt
    ├── package.xml
    ├── README.md
    ├── setup.py
    └── src
        └── some_other_code.cpp
I am trying to add a custom target called clean_compiled that will run the make clean target of a specific external project (already works). But I also need to add a custom command to this clean target in the CMakeLists of another directory. This is what I have:
First CMakeLists.txt:
# We add the custom target
add_custom_target(clean_compiled
    COMMENT "Cleaning compilation files in ${SOME_DESTDIR}"
    COMMAND $(MAKE) clean PATH=${SOME_DESTDIR}
    WORKING_DIRECTORY ${SOME_WORKING_DIRECTORY}
)
Second CMakeLists.txt:
# We try to add a custom command to this custom target
add_custom_command(TARGET clean_compiled
    COMMENT "Cleaning compilation files in ${ANOTHER_DESTDIR}"
    WORKING_DIRECTORY ${ANOTHER_WORKING_DIRECTORY}
    COMMAND $(MAKE) clean PATH=${ANOTHER_DESTDIR}
)
When I run make clean_compiled for the whole repository, I get this warning:
TARGET 'clean_compiled' was not created in this directory.
This warning is for project developers. Use -Wno-dev to suppress it.
And the custom_command is never called.
Is there any way to link the custom command to the target in the other directory without causing circular dependencies?
The documentation for add_custom_command is pretty clear that the TARGET form only works with targets created in the same directory. There is no way around this, so the second rule cannot be a custom command; it needs to be a custom target of its own, for example one called clean_another_dest_dir.
https://discourse.cmake.org/t/create-target-that-only-runs-other-target/885/6
You can use add_dependencies to make custom targets depend on other custom targets and make them run, e.g. add_dependencies(clean_another_dest_dir clean_compiled), but that isn't always what you want.
Another option is a custom target that runs all the other custom targets.
add_custom_target(clean_all_dir
    COMMAND ${CMAKE_COMMAND} --build ${CMAKE_BINARY_DIR} --target clean_compiled
    COMMAND ${CMAKE_COMMAND} --build ${CMAKE_BINARY_DIR} --target clean_another_dest_dir
)

CTest add tests in subdirectories

I have a CMake-based project that consists of several sub-components, which can all be independently compiled and tested. The directory layout looks like this:
.
├── CMakeLists.txt
├── comp1
│   ├── CMakeLists.txt
│   ├── src
│   │   ├── foo.cc
│   │   └── foo.h
│   └── tests
│       ├── CMakeLists.txt
│       └── test_comp1.cc
└── comp2
    ├── CMakeLists.txt
    ├── src
    │   ├── bar.cc
    │   └── bar.h
    └── tests
        ├── CMakeLists.txt
        └── test_comp2.cc
I want to enable ctest, therefore in the root CMakeLists.txt I have include(CTest) and in the component-specific CMakeLists.txt files I have
if(BUILD_TESTING)
    add_subdirectory(tests)
endif()
In compX/tests/CMakeLists.txt I have the code to compile the test and the add_test() command. The tests get successfully compiled and I can run them manually. However, if I call ctest, it returns
No tests were found!!!
After playing a bit with this, it turned out that if I move the add_subdirectory(tests) call to the root CMakeLists.txt like this:
if(BUILD_TESTING)
    add_subdirectory(comp1/tests)
endif()
it works. But I find this quite ugly and messy, to put component-specific stuff into the root file.
Conversely, I tried to move the include(CTest) command one level down into the component-specific CMakeLists.txt. But ctest complains with this:
*********************************
No test configuration file found!
*********************************
Is there seriously no way to use ctest with a directory structure like above?
The CTest documentation isn't the clearest.
A project I'm working on has a similar directory structure composed of various units, and in each unit are src and tests subdirectories.
CMake documentation says to call "enable_testing()" at the top-level CMakeLists.txt file, and it further says this command will "enable CTest at the current directory and below." Which sounds recursive, right? Well it's not.
Every CMakeLists.txt must have enable_testing() called to enable automatic CTest discovery in that directory.
Thus, in your project, the toplevel CMakeLists.txt will need enable_testing(), then comp{1,2}/CMakeLists.txt will need it, and finally comp{1,2}/tests/CMakeLists.txt will need it.
Once you add those commands and rerun cmake, those directories will each contain a CTestTestfile.cmake file, which is what the ctest program will look for when it runs.
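A minimal sketch of the resulting files, following this answer's advice (test and file names taken from the question's layout):

```cmake
# ./CMakeLists.txt (root)
include(CTest)            # defines BUILD_TESTING and calls enable_testing()
add_subdirectory(comp1)
add_subdirectory(comp2)

# ./comp1/CMakeLists.txt
enable_testing()          # repeated so this directory gets a CTestTestfile.cmake
if(BUILD_TESTING)
    add_subdirectory(tests)
endif()

# ./comp1/tests/CMakeLists.txt
enable_testing()
add_executable(test_comp1 test_comp1.cc)
add_test(NAME test_comp1 COMMAND test_comp1)
```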