Generating compile_commands.json without generating build files - cmake

I'd like to generate a compile_commands.json file for use with the clangd language server. However, CMAKE_EXPORT_COMPILE_COMMANDS only works with the Makefile and Ninja generators. When building a project that uses a different build system, it would be convenient to generate compile_commands.json as if I were using make or ninja, without actually generating any build files that interfere with the build system I'm actually using for the build.
What is the most convenient way to do this with cmake?

I think your only option here is to have a separate build folder that uses the Ninja or Makefile generator just to produce compile_commands.json, and a different build folder for your "actual" build.
The thing is, CMake is a generator, and it doesn't support mixing generators in a single build folder; in fact, it should not. If it did, you would end up with stray artifacts from different build systems inside the build folder that could eventually conflict with each other.
That being said, be aware that a Ninja-based compile_commands.json is not going to be fully representative of the "actual" build system you want to use. I can see it being useful, but it won't be identical, for sure.
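As a rough sketch (assuming CMake 3.13+ for the -S/-B options and a POSIX shell; the build-clangd directory name is arbitrary), the throwaway build tree could be set up like this:

cmake -S . -B build-clangd -G Ninja -DCMAKE_EXPORT_COMPILE_COMMANDS=ON
ln -sf build-clangd/compile_commands.json compile_commands.json   # so clangd can find it next to the sources

The real build keeps its own, untouched build directory.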

Related

Is it possible to force CMake to run add_compile_definitions() each time?

I have an embedded project (using ESP-IDF, which builds projects with CMake) where a props.json file contains several settings (e.g. "device type"). Based on the actual value of "deviceType", CMake opens and reads props.json by calling execute_process() and jq, then defines C preprocessor macros such as DEVICE_TYPE_A using add_compile_definitions().
The problem is that this only runs when I modify CMakeLists.txt or clean the whole project, but I don't want to recompile every component when I change props.json, only the files I wrote (i.e. the ones that depend on the settings). I'd like CMake to read the file each time I build the project, without cleaning it.
I did my research, so I know about add_custom_target() and add_custom_command(), which behave that way; however, add_compile_definitions() cannot be called from a script. Is there a solution to achieve this, or should I just use a header file generated by configure_file() and leave add_compile_definitions() alone?
This is actually pretty easy and you don't need to manually reconfigure CMake. Just add the following to the CMakeLists.txt in the directory containing your props.json file:
set_property(DIRECTORY . APPEND PROPERTY CMAKE_CONFIGURE_DEPENDS props.json)
This will add props.json to the list of files that the CMake-generated build system scans when determining whether to re-run the CMake configure step. See the docs on CMAKE_CONFIGURE_DEPENDS for more detail.
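Put together with the configure-time read, it might look roughly like this (a sketch, assuming props.json has a top-level "deviceType" key and that jq is on the PATH; the variable name is a placeholder):

# read the setting at configure time
execute_process(
    COMMAND jq -r .deviceType ${CMAKE_CURRENT_SOURCE_DIR}/props.json
    OUTPUT_VARIABLE DEVICE_TYPE
    OUTPUT_STRIP_TRAILING_WHITESPACE)
# e.g. a deviceType of "A" becomes the macro DEVICE_TYPE_A
add_compile_definitions(DEVICE_TYPE_${DEVICE_TYPE})
# re-run the configure step whenever props.json changes
set_property(DIRECTORY . APPEND PROPERTY CMAKE_CONFIGURE_DEPENDS props.json)

Note that since add_compile_definitions applies directory-wide, changing the value will still recompile everything the definition applies to; scoping it with target_compile_definitions on the targets that actually use it narrows that.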
In general, you should never need to manually re-run CMake [1] after the first configure. If you do, it is an indication that you have not communicated all of the necessary information for CMake to generate a correct build system.
[1] There is one notable exception: Xcode is known to be buggy when re-running the CMake configure step automatically.

globbing files autogenerated by a sub_directory binary

I have the following problem: I have a separate CMake project that generates C++ files, and I want to glob them into a library that uses them. Right now this is done in this manner:
add_subdirectory(generator)
add_custom_target(run-generator ... byproduct GENERATED_FILES)
include(files.txt)
target_link_libraries(library ${GENERATED_FILES})
The included files.txt is really just a set(GENERATED_FILES all_autogen_files); right now those files are fixed, but in the future they might change. What I would want instead is something like
add_subdirectory(generator)
execute_process(COMMAND generator_binary ...)
file(GLOB GENERATED_FILES output_location_of_gen_files/*)
target_link_libraries(library ${GENERATED_FILES})
As I understand it, execute_process runs at the point where it is read, so this would generate all the files before the file(GLOB). But I don't know how I would go about actually building the generator binary before the execute_process, since right now what builds it first is that it is a dependency through the target_link_libraries call.
The only way this could possibly work is if you create a superbuild, i.e. a CMake project that builds everything with ExternalProjects. The reason is that you can't create new targets or add sources to existing targets during the build.
With a superbuild you need at least 3 separate CMake projects: One that builds the generator and then generates the files, one that globs the generated files and builds the rest of your build artifacts, and the superbuild project that adds both with ExternalProject_Add. By setting the dependencies correctly you can then ensure that the project that uses the generated files is configured after the generating project has been built.
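A superbuild CMakeLists.txt could look roughly like this (a sketch; generator/ and consumer/ are placeholder directories, each containing one of the two sub-projects):

include(ExternalProject)

# builds the generator and runs it as part of its own build
ExternalProject_Add(generator_proj
    SOURCE_DIR ${CMAKE_CURRENT_SOURCE_DIR}/generator
    INSTALL_COMMAND "")

# only configured (and therefore only globbed) after generator_proj has been built
ExternalProject_Add(consumer_proj
    SOURCE_DIR ${CMAKE_CURRENT_SOURCE_DIR}/consumer
    DEPENDS generator_proj
    INSTALL_COMMAND "")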
However, globbing in CMake is discouraged anyway, so listing the files explicitly is the proper way to do it. If your code generator starts producing new files, they should be added to the list manually in the same commit; otherwise, even with CONFIGURE_DEPENDS, it is not guaranteed that the new files will be built when combining globbing with the ExternalProject approach.

How to handle autotools project with cmake dependency?

I have an autotools C project that needs to use another library that is built with CMake. Is there an equivalent to AC_CONFIG_SUBDIRS that will work with CMake?
I take it that you want to configure and build the CMake-based project as part of configuring and building the Autotools-based host project. This is possible, and there are several viable ways to do it, but I'm not aware of anything wholly pre-packaged like AC_CONFIG_SUBDIRS is for Autotools-based subprojects.
For configuration
Option 1 - config commands
Autoconf provides a group of macros by which you can specify custom commands for configure or the generated config.status script to run. You could use one of these -- probably AC_CONFIG_COMMANDS, but maybe AC_CONFIG_COMMANDS_POST -- to run cmake (and any wanted preparatory steps) in the subproject. Personally, I like this option best.
Option 2 - glue script
AC_CONFIG_SUBDIRS instructs configure to run configure scripts in the specified subdirectories, but those other configure scripts don't need to be Autotools-generated. You could conceivably write a custom wrapper script named "configure" in the subproject directory for the parent configure to run, but which itself performs an appropriate call to cmake. AC_CONFIG_SUBDIRS in the top-level configuration should then run that script at the right time.
Option 3 - custom code
I think Autoconf already provides sufficient support for what you seem to want, but if you think otherwise then you always have the option of writing whatever shell code you want into configure via configure.ac. You might find it worthwhile to write a custom macro for that, especially if you have multiple CMake subprojects, but that's not obligatory. Note that such commands are distinguished from those specified via AC_CONFIG_COMMANDS & co. by the timing of their execution.
For building
Presumably you'll be relying on recursive make during the build and installation steps. It shouldn't be hard to make that work, whether you're using an Automake-based Makefile.in or a hand-rolled one at the top level.
Option 1 - Automake + glue makefile
Use a SUBDIRS variable in your top-level Makefile.am to direct make to recurse into the CMake project's subdirectory, just as you would do into any other project's. Write a simple Makefile there that recurses into a build subdirectory (which you will have had to ensure is created and configured by configure). This should not collide with the subproject because it presupposes that a separate build directory is used. The glue makefile can adapt targets and make variables to the expectations of the subproject's build system.
The Automake documentation describes all the targets that the top-level Autotools makefile might try to build recursively, and the glue makefile should provide all of them -- though there may be many that need only a dummy (but not empty) recipe.
Option 2 - hand-rolled top-level Makefile.in
If, on the other hand, you're using a hand-rolled top-level Makefile template then you have full control over your recursive make invocations. You could still use a glue makefile in the subproject in this case, but it's probably easier and cleaner to just adapt directly to the expected CMake-generated makefile.

When should I rerun cmake?

After running the cmake command once to generate a build system, when, if ever, should I rerun the cmake command?
The generated build systems can detect changes in the associated CMakeLists.txt files and behave accordingly. You can see the logic for doing so in generated Makefiles. The exact rules for when this will happen successfully are mysterious to me.
When should I rerun cmake? Does the answer depend on the generator used?
This blog post (under heading: "Invoking CMake multiple times") points out the confusion over this issue and states that the answer is actually 'never', regardless of generator, but I find that surprising. Is it true?
The answer is simple:
The cmake binary of course needs to re-run each time you make changes to any build setting, but by design you won't need to do that yourself; hence "never" is correct as far as commands you have to issue are concerned.
The build targets created by cmake automatically include checks for every file (starting from the main CMakeLists.txt file) that was involved in or included while generating the current set of Makefiles/VS projects/whatever. When you invoke make (assuming Unix here), this automatically triggers a re-run of cmake first if necessary; so your generated projects include logic to invoke cmake themselves! Since all command-line parameters you initially passed (e.g. cmake -DCMAKE_BUILD_TYPE=RELEASE ..) are stored in CMakeCache.txt, you don't need to re-specify any of them on subsequent invocations, which is why the projects can just run cmake and know it still does what you intended.
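In practice (a minimal illustration; the Release build type is just an example) that means:

cmake -DCMAKE_BUILD_TYPE=Release ..   # configure once; the options end up in CMakeCache.txt
make                                  # re-runs cmake automatically if any of its input files changed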
Some more detail:
CMake generates book-keeping files containing all files that were involved in Makefile/Project generation, see e.g. these sample contents of my <binary-dir>/CMakeFiles/Makefile.cmake file using MSYS makefiles:
# The top level Makefile was generated from the following files:
set(CMAKE_MAKEFILE_DEPENDS
"CMakeCache.txt"
"C:/Program Files (x86)/CMake/share/cmake-3.1/Modules/CMakeCCompiler.cmake.in"
"C:/Program Files (x86)/CMake/share/cmake-3.1/Modules/RepositoryInfo.txt.in"
"<my external project bin dir>/release/ep_tmp/IRON-cfgcmd.txt.in"
"../CMakeFindModuleWrappers/FindBLAS.cmake"
"../CMakeFindModuleWrappers/FindLAPACK.cmake"
"../CMakeLists.txt"
"../CMakeScripts/CreateLocalConfig.cmake"
"../Config/Variables.cmake"
"../Dependencies.cmake"
"CMakeFiles/3.1.0/CMakeCCompiler.cmake"
"CMakeFiles/3.1.0/CMakeRCCompiler.cmake")
Any modification to any of these files will trigger another cmake run whenever you choose to start a build of a target. I honestly don't know how fine-grained that dependency tracking is in CMake, i.e. whether a target will be left alone if changes somewhere else don't affect that target's compilation. I wouldn't expect it, as this can get messy quite quickly, and repeated CMake runs (correctly using the cache capabilities) are very fast anyway.
The only case where you need to re-run cmake yourself is when you change the compiler after you have already started a project(MyProject); but even this case is handled automatically by newer CMake versions now (with some yelling :-)).
additional comment responding to comments:
There are cases where you will need to manually re-run cmake, and that is whenever you write your configure scripts so badly that cmake cannot possibly detect files/dependencies you're creating. A typical scenario would be that your first cmake run creates files using e.g. execute_process and you would then include them using file(GLOB ..). This is BAD style and the CMake Docs for file explicitly say
Note: We do not recommend using GLOB to collect a list of source files from your source tree. If no CMakeLists.txt file changes when a source is added or removed then the generated build system cannot know when to ask CMake to regenerate.
Btw, this note also sheds light on the self-invocation by the generated build system explained above :-)
The "proper" way to treat this kind of situations where you create source files during configure time is to use add_custom_command(OUTPUT ...), so that CMake is "aware" of a file being generated and tracks changes correctly. If for some reason you can't/won't use add_custom_command, you can still let CMake know of your file generation using the source file property GENERATED. Any source file with this flag set can be hard-coded into target source files and CMake wont complain about missing files at configure time (and expects this file to be generated some time during the (first!) cmake run.
While looking into this topic in order to read version information from a debian/changelog file (during the generation phase), I ran into the same issue: a cmake run should be triggered whenever debian/changelog is modified. So I needed to add debian/changelog to CMAKE_MAKEFILE_DEPENDS.
In my case, debian/changelog is read through execute_process. Unfortunately, execute_process offers no way to add the files it reads to CMAKE_MAKEFILE_DEPENDS. But I found that running configure_file will do it. I am really missing something like a DEPENDENCIES option in execute_process.
However, as I needed to configure the debian/changelog file for my purposes anyway, the solution came to me implicitly.
I actually also found a hint about this in the official documentation of configure_file:
"If the input file is modified the build system will re-run CMake to re-configure the file and generate the build system again."
So using configure_file should be a safe way to trigger the re-run of cmake.
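For example (a sketch, assuming dpkg-parsechangelog is available for the read; the variable name is a placeholder), the configure-time read plus the dependency registration could look like:

# read the package version from debian/changelog at configure time
execute_process(
    COMMAND dpkg-parsechangelog --show-field Version
    WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
    OUTPUT_VARIABLE PKG_VERSION
    OUTPUT_STRIP_TRAILING_WHITESPACE)
# copying the file with configure_file registers debian/changelog as a configure dependency
configure_file(debian/changelog ${CMAKE_CURRENT_BINARY_DIR}/changelog.copy COPYONLY)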
From a user perspective, I would expect other commands to extend CMAKE_MAKEFILE_DEPENDS too, e.g. execute_process (on demand), but also file(READ) (implicitly, like configure_file). Perhaps there are others. Every file that is read is likely to influence the generation phase. As an alternative, it would be nice to have a command that just extends the dependency list (hint for the cmake developers, perhaps one comes along).

Is it possible to build binaries for different targets using CMake?

I'm considering to use CMake for projects targeting a microcontroller. I found out how to create a toolchain file and invoke cmake -DCMAKE_TOOLCHAIN_FILE=Path/To/Toolchain.cmake to make CMake do cross-compiling.
However, most projects that I work on also have code that must be compiled for the host platform. These are often unit tests or other test tools, which share most of their code with the binary that will run on the microcontroller. A rare case might be a project that even has two processors with different instruction set architectures, thus needing a host compiler and two different cross compilers.
I'd like to have one build that rules them all. Is it possible to have a setup where I only need to call cmake /path/to/source && make, or is the only solution to have multiple 'root' CMakeLists.txt files, one for each target?
Each cmake run will target one specific generator and thus one platform.
What you want can be achieved by having one hierarchy of CMakeLists files for each platform. You need to get to a point where doing a succession of cmake .. && make calls will build the whole project.
Then write a master CMakeLists that executes all of those separate build steps for you, e.g. through ExternalProject_Add or by using custom commands. Depending on the structure of your project, it might make sense to have only the tools required for building be processed this way, and to add the sources for the actual project directly to the master CMakeLists instead.
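A rough sketch of such a master CMakeLists.txt (the directory names and the toolchain file path are placeholders for your own layout):

include(ExternalProject)

# host-platform unit tests, built with the default compiler
ExternalProject_Add(host_tests
    SOURCE_DIR ${CMAKE_CURRENT_SOURCE_DIR}/tests
    INSTALL_COMMAND "")

# firmware, cross-compiled with the microcontroller toolchain
ExternalProject_Add(firmware
    SOURCE_DIR ${CMAKE_CURRENT_SOURCE_DIR}/firmware
    CMAKE_ARGS -DCMAKE_TOOLCHAIN_FILE=${CMAKE_CURRENT_SOURCE_DIR}/Toolchain.cmake
    INSTALL_COMMAND "")

Running cmake /path/to/source && make on this top-level project then drives both sub-builds.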