I have a big project built with CMake. It mostly works.
But recently some combination of compilation server and test server broke. Investigation found that the final compile/link command calls gcc (...) -licudata -licui18n -licuuc (...), which introduces a dependency on shared libraries that are not present on the test server.
How do I find out what in my project (my own library, an imported library, a found library, whatever) adds those 3 flags to the link command?
I don't add them explicitly, so something adds them automagically and I want to find it. compile_commands.json doesn't have them because linking flags don't belong in it. CMakeCache.txt has those flags in an obscure variable PC_LIBXML_STATIC_LIBRARIES:INTERNAL, but removing them there doesn't affect the compile/link command.
Note that this question is not about dealing with libicu specifically but about a method of investigation in general (though comments about any known problems with libicu would be appreciated too).
I found out that the dependency graphs created by CMake can have more detail than what was configured for our project. All the options are listed here: https://cmake.org/cmake/help/latest/module/CMakeGraphVizOptions.html I expect GRAPHVIZ_EXTERNAL_LIBS and GRAPHVIZ_SHARED_LIBS are the most important ones to set to true.
We enabled everything that could be enabled and filtered out nothing. The resulting graph was massive (too big for xdot; luckily .dot files are human readable), but it showed that Boost::regex uses those 3 libraries.
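For reference, a minimal version of such a setup might look like the following. The GRAPHVIZ_* variables are documented in the link above; the file name deps.dot and the grep pattern are only examples:

    # CMakeGraphVizOptions.cmake, placed next to the top-level CMakeLists.txt (or in the build dir)
    set(GRAPHVIZ_EXTERNAL_LIBS TRUE)       # include external libraries in the graph
    set(GRAPHVIZ_SHARED_LIBS TRUE)         # include shared library targets
    set(GRAPHVIZ_GENERATE_DEPENDERS TRUE)  # also emit per-target "who depends on X" files

Then, from an already-configured build directory:

    cmake --graphviz=deps.dot .
    grep -n icu deps.dot    # shows which target nodes have edges to the ICU libraries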
Context
I'd like to run a Qt application on an i.MX6-based system with a Yocto-built image. I already have the meta-qt5 layer in my project. I started with a simple Qt5 application that only needed the following modules:
QT += core gui widgets
All I have to do is make sure my BitBake recipe has DEPENDS += "qtbase" and is based on the qmake class with inherit qmake5, and it builds and runs on the target! No problem.
Problem
Now I'd like to add another Qt5 application, this time with the following modules and one plugin:
QT += core gui widgets quick qml svg xml network charts
QTPLUGIN += qsvg
Unfortunately, I'm not able to simply add these to my DEPENDS variable and get it to work. Googling around for how to add support reveals what seems to be a sprawling assortment of solutions. I'll enumerate what I've found here:
I need to add inherit populate_sdk_qt5 to instruct bitbake to build the recipe against the SDK that contains the libraries for the modules (see here)
I need to add IMAGE_FEATURES += dev-pkgs to the recipe (see here)
I need to modify local.conf for the system and add lines like PACKAGECONFIG_append_pn-qttools = "..." and PACKAGECONFIG_append_pn-qtbase = "..."
I need to modify layer.conf in my layer and add things like IMAGE_INSTALL_append = "qtbase qtquick ..." (slide 53 here)
I need to manually patch the Qt5 toolchain for charts? (see here)
I need to compile my image using bitbake <target> -c populate_sdk? (see here again)
At this point, I'm really unsure what exactly is going on. It seems we're modifying the recipe, the layer configuration file, the distribution configuration file, and even meta-Qt layer files. I know that I fundamentally need to do a few things:
Compile the application against the Qt5 SDK
Compile the needed plugins + modules for the target architecture
Make sure the appropriate binaries (images) are copied to the target.
But it has become a bit unclear what does what. I know that IMAGE_INSTALL_append adds packages to the target image, but I am lost as to the proper way to add the modules. I don't want to go about randomly adding lines, so I'm hoping someone can clear up a bit what exactly I need to be looking at in order to add support for a Qt5 module for an application.
There are several different problems stated here. Your preferred way seems to be building a recipe directly, not using the toolchain/SDK. So you need the image to contain the libraries your application needs.
First of all, Qt SVG is not part of Qt Base; it is a separate module, so you need it installed.
Add Qt SVG support
You need Qt SVG on the target in order to run your app. Either in your image recipe or in local.conf you need:
IMAGE_INSTALL_append = " qtsvg"
In addition, your app's recipe needs Qt SVG at build time, so you need to DEPEND on it in your app's recipe like this:
DEPENDS += "qtsvg"
Here qtsvg is the name of the other recipe, namely qtsvg_git.bb, not to be confused with the similarly named qsvg plugin. It will get pulled in automatically at build time on your development machine; without it your app won't even build.
Remember that Yocto creates a separate sysroot under its tmp directory in order to build each recipe, so you must declare what your recipe needs or the build won't find it and will fail.
You can also check the recipe for one of the Qt5 examples, as it also has DEPENDS and RDEPENDS. You can get more info on those here.
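Putting the pieces together, a sketch of an application recipe could look like this. The recipe name, SRC_URI and runtime package names here are placeholders; check the exact package names produced by your meta-qt5 version (e.g. with oe-pkgdata-util):

    # myapp_1.0.bb -- hypothetical application recipe
    SUMMARY = "Qt5 application using Quick, SVG and Charts"
    LICENSE = "CLOSED"
    SRC_URI = "git://example.com/myapp.git;branch=master"
    S = "${WORKDIR}/git"

    inherit qmake5

    # Build-time dependencies: one meta-qt5 recipe per Qt module you link against.
    DEPENDS += "qtbase qtsvg qtdeclarative qtcharts"

    # Runtime dependencies: packages that must land in the image next to the app.
    RDEPENDS_${PN} += "qtsvg-plugins qtdeclarative-qmlplugins"

Installing the app into the image (for example IMAGE_INSTALL_append = " myapp" in local.conf or your image recipe) then pulls its runtime dependencies in automatically.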
From the docs page:
CMAKE_BUILD_TYPE
Specifies the build type on single-configuration generators.
This statically specifies what build type (configuration) will be built in this build tree. Possible values are empty, Debug, Release, RelWithDebInfo and MinSizeRel. This variable is only meaningful to single-configuration generators (such as Makefile Generators and Ninja) i.e. those which choose a single configuration when CMake runs to generate a build tree as opposed to multi-configuration generators which offer selection of the build configuration within the generated build environment. There are many per-config properties and variables (usually following clean SOME_VAR_<CONFIG> order conventions), such as CMAKE_C_FLAGS_<CONFIG>, specified as uppercase: CMAKE_C_FLAGS_[DEBUG|RELEASE|RELWITHDEBINFO|MINSIZEREL]. For example, in a build tree configured to build type Debug, CMake will see to having CMAKE_C_FLAGS_DEBUG settings get added to the CMAKE_C_FLAGS settings. See also CMAKE_CONFIGURATION_TYPES.
I'm aware of the differences between Debug builds and Release builds, but what are the differences between Release, RelWithDebInfo and MinSizeRel? I'm guessing RelWithDebInfo means creating debuggable binaries, and MinSizeRel means creating the smallest possible binaries.
From the LLVM CMake page:
CMAKE_BUILD_TYPE:STRING
If you are using an IDE such as Visual Studio, you should use the IDE settings to set the build type. Be aware that Release and RelWithDebInfo use different optimization levels on most platforms.
If I want to generate a production build, should I choose Release?
IMPORTANT: CMAKE_BUILD_TYPE only makes sense for single-configuration generators, like Makefiles. It is not used by multi-configuration generators, as those simply generate a build system capable of building all build types (Debug, Release, etc.).
CMAKE_BUILD_TYPE is about,
Optimization (level) [-O0, -O1, -O2, -O3, -Ofast, -Os, -Oz, -Og, -O, -O4]
Including 'debug info' in the executable [-g, -gline-tables-only, -gmodules, -glevel, -gcoff, -gdwarf, -gdwarf-version, -ggdb, -grecord-gcc-switches, -gno-record-gcc-switches, -gstabs, -gstabs+, -gstrict-dwarf, -gno-strict-dwarf, -gcolumn-info, -gno-column-info, -gvms, -gxcoff, -gxcoff+, -gz[=type]]
Generating code for assert() or not [-DNDEBUG]
Including debug (output) code or not [custom]
Most of these compiler options are compiler- and/or platform-specific. So, extending support for a build type means updating every toolchain that you want to support.
The default build types that come with cmake more or less mean the following,
1. Release: high optimization level, no debug info, code or asserts.
2. Debug: No optimization, asserts enabled, [custom debug (output) code enabled],
debug info included in executable (so you can step through the code with a
debugger and have address to source-file:line-number translation).
3. RelWithDebInfo: optimized, *with* debug info, but no debug (output) code or asserts.
4. MinSizeRel: same as Release but optimizing for size rather than speed.
In terms of compiler flags that usually means (since these are supported in most cases on all platforms anyway):
1. Release: `-O3 -DNDEBUG`
2. Debug: `-O0 -g`
3. RelWithDebInfo: `-O2 -g -DNDEBUG`
4. MinSizeRel: `-Os -DNDEBUG`
Here -DNDEBUG is added on platforms that support it (it disables assert()). This, of course, is why you should make sure that none of your asserts have side effects.
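If you want to see exactly what your toolchain puts behind each build type, you can print the per-config variables from a trivial CMakeLists.txt. A sketch (the project name is arbitrary; the values differ per compiler and CMake version):

    cmake_minimum_required(VERSION 3.10)
    project(flags_demo LANGUAGES CXX)

    # Print the default flags CMake appends for each build type with this compiler.
    foreach(config RELEASE DEBUG RELWITHDEBINFO MINSIZEREL)
      message(STATUS "CMAKE_CXX_FLAGS_${config} = ${CMAKE_CXX_FLAGS_${config}}")
    endforeach()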
Extending the build type
Adding stuff that needs different options for different toolchains is not something you really want to do in general (although compiler options are basically compiler/language specific, so you could rather easily check the compiler ID if you wanted and then pick your flags depending on that).
It is rather easy to add support in the form of altering optimization flags or debug flags when you restrict yourself to [-g, -O0, -O2, -O3 and -Os], removing a possible -DNDEBUG flag and/or adding custom macros.
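If you did want to go the compiler-ID route mentioned above, a sketch could look like the following (the build-type name MyCustom and the flag values are purely illustrative; the approach further below instead derives its flags from the existing Release/RelWithDebInfo variables, which needs no per-compiler knowledge):

    # Pick raw flags per compiler for a hypothetical custom build type "MyCustom".
    if(CMAKE_CXX_COMPILER_ID MATCHES "GNU|Clang")
      set(CMAKE_CXX_FLAGS_MYCUSTOM "-O2 -g -DDEBUG")
    elseif(CMAKE_CXX_COMPILER_ID STREQUAL "MSVC")
      set(CMAKE_CXX_FLAGS_MYCUSTOM "/O2 /Zi /DDEBUG")
    endif()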
Suppose we have a macro DEBUG that we want to define to include specific debug code (which could include writing debug output for example).
Then we have four optimization levels, debug info or not, assert code or not and debug code or not, for a total of 4 * 2 * 2 * 2 = 32 configurations (build types). But clearly not all configurations are very practical. It is better to look at what the use case is for a configuration.
Clearly we have the Release build, which is bug-free code that is released at large; it is production code. You will not compile it very often and when you do it is more important that the resulting code is fast (or small?) than that it matters how long it takes to compile it. That leads to the two existing build types for production code:
1. Release
2. MinSizeRel
But then it turns out there is a bug after all in the production code that makes the application crash. You can't reproduce it and it only happens sometimes. You implemented a feedback mechanism for your users to send you the core dump, but the info just isn't enough. You want to get the stack trace in the hope it will tell you more. You ask certain users (or maybe yourself, using it on a daily basis as 'user') to download a special version that is usable as normal (it is fast enough, optimized) but it has debug information included, so takes a lot longer to download. Those users don't mind that: they want this crash to be fixed. This is supported with
3. RelWithDebInfo
Of course, you as the developer need a version that you can step through with a debugger. It doesn't have to be fast - you already know how to reproduce a bug that doesn't depend on optimization (it is a logic bug, a problem in your code - not a Heisenbug). For this you use,
4. Debug
But you also have beta testers (maybe you yourself, using the program on a daily basis as a 'user'). In this case you want the code to be optimized so it is fast enough, but you also want all asserts to be turned on; an assert might tell you where a problem is far better than a core dump that happens later. Or worse, the program might just behave strangely and not crash at all. You need to be sure that none of your asserts fire, even in production code; that is what beta testers are for. Let's call this build type,
5. BetaTest [`-O3 -g`] - aka Release minus the `-DNDEBUG` but with `-g`.
Finally, there are debug builds that are not meant for stepping through with a debugger; some bugs are simply not reproducible, nor does a stack trace help (because either there is no core dump, or the problem isn't causing an immediate crash). There are many, if not most, such bugs. The only way to find those bugs (once they occur) is with extra debug code and/or loads of debug output written to a log file. You want this code to be compiled with at least -O2, but you want asserts on too (why not), and you need the macro DEBUG to be defined. We might as well include debug info too, because the size of the executable is of lesser concern here. Let's call such a build
6. RelWithDebug [`-O2 -g -DDEBUG`] - aka RelWithDebInfo but `-DNDEBUG` removed and `-DDEBUG` added.
I suggest -O2 here because this is what you, as a developer, would compile with most of the time: you yourself will always be such a user, because if something unexpected happens you want to know what caused it (have those logs!), and you don't want to compile with the much slower -O3 all the time...
To support these two extra build types we therefore need to be able to do two things: get the flags of an existing build type (and change them), and use those flags for a new (custom) build type.
Here is how to do this
If you add the following lines somewhere near the top of your project root's CMakeLists.txt file, then using -DCMAKE_BUILD_TYPE=BetaTest (or RelWithDebug) will use the flags as outlined above. Of course, you might have to make more changes if anything else in your .cmake files depends on the build type. Here is an example of what I am using personally: CW_OPTIONS.cmake (look, case-insensitively, for betatest and relwithdebug, plus the variables that are set depending on the values of those two).
# BetaTest: Release flags with -DNDEBUG removed (asserts stay enabled) and -g added for debug info.
string(REGEX REPLACE "( -DNDEBUG$|-DNDEBUG )" "" CMAKE_CXX_FLAGS_BETATEST "${CMAKE_CXX_FLAGS_RELEASE} -g" )
string(REGEX REPLACE "( -DNDEBUG$|-DNDEBUG )" "" CMAKE_C_FLAGS_BETATEST "${CMAKE_C_FLAGS_RELEASE} -g" )
# RelWithDebug: RelWithDebInfo flags with -DNDEBUG removed and -DDEBUG added.
string(REGEX REPLACE "-DNDEBUG " "" CMAKE_CXX_FLAGS_RELWITHDEBUG "${CMAKE_CXX_FLAGS_RELWITHDEBINFO} -DDEBUG" )
string(REGEX REPLACE "-DNDEBUG " "" CMAKE_C_FLAGS_RELWITHDEBUG "${CMAKE_C_FLAGS_RELWITHDEBINFO} -DDEBUG" )
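With those lines in place, you configure the build tree like any other build type (assuming a single-configuration generator such as Makefiles or Ninja):

    mkdir build && cd build
    cmake -DCMAKE_BUILD_TYPE=BetaTest ..   # or RelWithDebug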
RelWithDebInfo is like Release, but it additionally generates debug information, allowing you to have symbol files for debugging.
For example, in Visual Studio you'll get .pdb files; without them it is hard to debug, because the symbols in the binary are not human readable and there is no way to map them back to the source code.
MinSizeRel is the same as Release, with its optimization configuration just set to Minimize Size instead of Maximize Speed as illustrated in this link in Visual Studio, for example.
If I want to generate a production build, should I choose Release?
Yes, that should do the job for you. Debug/Release are the most commonly used options.
Reading this CMake FAQ will actually help you a lot.
Yes, you are correct:
I'm guessing RelWithDebInfo meant creating debuggable binaries, and MinSizeRel meant creating smallest possible size binaries.
RelWithDebInfo will add compiler flags for generating debug information (the -g flag for GCC / clang), and will result in debuggable, yet much larger binaries.
MinSizeRel will add compiler flags for generating more compact binaries (the -Os flag for GCC / clang), possibly at the expense of program speed.
If I want to generate a production build, should I choose Release?
Yes, Release would be a good choice. It should produce faster binaries by selecting a compiler optimization level that favors speed (-O3 for GCC / clang) and by not including debug symbols.
I'm writing a custom check for installed libraries in autoconf:
AC_DEFUN([AC_GHC_PKG_CHECK],[
...
GHC_PKG_RESULT=$($PYTHON autotools/check-ghc-version-range ....)
...
])
where my Python script that actually performs the check resides in the autotools/ sub-directory of the project.
However, this is not portable; for example, make distcheck fails because then the configure script is run from a different directory. How can I reference the absolute path to my Python script so that it gets called properly no matter what the current directory is?
ac_top_srcdir or ac_abs_top_srcdir should work in this case:
AC_DEFUN([AC_GHC_PKG_CHECK],[
...
GHC_PKG_RESULT=$($PYTHON $ac_top_srcdir/autotools/check-ghc-version-range ....)
...
])
EDIT: I don't think this approach will work -- it seems that $ac_top_srcdir isn't evaluated until later (AC_OUTPUT?).
What I think might work in this instance is to do something similar to what the runtime C tests do: blast a configuration test to a temporary file (conftest.py instead of conftest.c in this case) and run it. Unfortunately, there are (yet) no builtin macros for autoconf/automake or other tools that directly assist with this task.
Fortunately, it seems that a clever person has written at least a couple of different ways to do this. The first is GNU pyconfigure, which seems to have facilities for writing Python test code as I described above. The second is more of an ad hoc macro collection that he used for his project.
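A minimal hand-rolled sketch of that conftest.py idea (this is not pyconfigure's API; the probe and the result variable are made up for illustration) could look like this inside the macro body:

    dnl Write a throwaway Python probe and run it, C-conftest style.
    echo 'import sys; sys.exit(0 if sys.version_info >= (3, 0) else 1)' > conftest.py
    AS_IF([$PYTHON conftest.py],
          [ghc_pkg_probe=yes],
          [ghc_pkg_probe=no])
    rm -f conftest.py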
You can use $srcdir.
It's not necessarily an absolute path, but it's a path that points from the top of the build tree to the top of the source tree.
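Applied to the macro from the question, that would look like this (same script path as above; the quotes guard against paths containing spaces):

    AC_DEFUN([AC_GHC_PKG_CHECK],[
    ...
    GHC_PKG_RESULT=$($PYTHON "$srcdir"/autotools/check-ghc-version-range ....)
    ...
    ])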
I am trying to clean up some of my projects, and one of the things that are puzzling me is how to deal with header files in static libraries that I have added as "project dependencies" (by adding the project file itself). The basic structure is like this:
MyProject.xcodeproj
Contrib
thirdPartyLibrary.xcodeproj
Classes
MyClass1.h
MyClass1.m
...
Now, the dependencies are all set up and built correctly, but how can I specify the public headers for thirdPartyLibrary.xcodeproj so that they are on the search path when building MyProject.xcodeproj? Right now, I have hard-coded the include directory in thirdPartyLibrary.xcodeproj, but obviously this is clumsy and non-portable. I assume that, since the headers are public and already built to some temporary location in ~/Library (where the .a file goes as well), there is a neat way to reference this directory. Only... how? An hour of Googling turned up blank, so any help is greatly appreciated!
If I understand correctly, I believe you want to add a path containing $(BUILT_PRODUCTS_DIR) to the HEADER_SEARCH_PATHS in your projects build settings.
As an example, I took an existing iOS project which contains a static library, which is included just in the way you describe, and set the libraries header files to public. I also noted that the PUBLIC_HEADERS_FOLDER_PATH for this project was set to "/usr/local/include" and these files are copied to $(BUILT_PRODUCTS_DIR)/usr/local/include when the parent project builds the dependent project. So, the solution was to add $(BUILT_PRODUCTS_DIR)/usr/local/include to HEADER_SEARCH_PATHS in my project's build settings.
HEADER_SEARCH_PATHS = $(BUILT_PRODUCTS_DIR)/usr/local/include
Your situation may be slightly different, but the exact path you're looking for can probably be found in Xcode's build settings. Also, you may find it helpful to add a Run Script build phase to your target and note the values of various settings at build time with something like:
echo "BUILT_PRODUCTS_DIR " $BUILT_PRODUCTS_DIR
echo "HEADER_SEARCH_PATHS " $HEADER_SEARCH_PATHS
echo "PUBLIC_HEADERS_FOLDER_PATH " $PUBLIC_HEADERS_FOLDER_PATH
# ... and so on for any other build settings you want to inspect
I think that your solution is sufficient and a generally accepted one. One alternative would be to have all header files located under an umbrella directory that describes the interface for using the depended-on libraries, and put that in your include path. I see this as being similar to /usr/include. Another alternative, which I have never personally tried but think would work, would be to create references to all the headers of thirdPartyLibrary from MyProject so that they appear to be members of MyProject. You would do this by dragging them from some location into MyProject and then deselecting the checkbox that says to copy them into the project's top-level directory. From one perspective this seems feasible to me because it is as if you are explicitly declaring that your project depends on those specific classes, but it is not directly responsible for compiling them.
One thing to be wary of when addressing this issue is depending on implementation-specific details of Xcode for locating libraries automatically. Doing so may seem innocuous at the time, but the workflows Xcode uses to build projects are subject to change with updates and could potentially break your project in subtle and confusing ways. If they are not well defined in some documentation, I would treat any effect as coincidental and not worth leveraging in your project when you can enforce the desired behavior by some other means. In the end, you may have to define a convention that you follow, or find one that you adopt from someone else. By doing so, you can rest assured that if your solution is documented and reproducible, any developer (including yourself in the future) can pick it up and proceed without tripping over it, and that it will stand the test of time.
The way we do it is to go into build target settings for the main project and add:
User Header Search Paths = "Contrib"
and check that it searches recursively. We don't see performance problems with searching recursively even with many (10-15 in some projects) dependencies.
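If you prefer to keep that setting in a versioned xcconfig file rather than clicking through the build-settings UI, the equivalent would be something like the following (the file name is hypothetical; the /** suffix is what makes the search recursive):

    // MyProject.xcconfig
    // Recursively search the Contrib directory for user headers.
    USER_HEADER_SEARCH_PATHS = $(SRCROOT)/Contrib/**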