Cache source code fetched via CMake FetchContent in GitHub Actions - cmake

I've got a CMake-based project in which I rely heavily on FetchContent to retrieve multiple libraries (source code). Downloading all the libraries takes time that I'd like to save when compiling on a CI runner (GitHub Actions).
I saw there's a cache action; how can I use it to cache the fetched source code?
As a bonus, it would be helpful to cache the compiled code of these fetched libraries as well. Is that possible somehow?
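
A minimal sketch of one way to wire this up, assuming FETCHCONTENT_BASE_DIR is redirected to a fixed directory that the cache action restores (the paths, cache key, and action version here are illustrative):

# .github/workflows/build.yml (fragment)
- name: Cache FetchContent downloads
  uses: actions/cache@v3
  with:
    # must match the FETCHCONTENT_BASE_DIR passed to CMake below
    path: ${{ github.workspace }}/deps
    # re-download only when the dependency declarations change
    key: fetchcontent-${{ hashFiles('**/CMakeLists.txt') }}

- name: Configure
  run: cmake -S . -B build -DFETCHCONTENT_BASE_DIR=${{ github.workspace }}/deps

FetchContent keeps each dependency's source, build, and subbuild trees under that base directory, so restoring it also brings back whatever was already compiled there. For the bonus part, caching the main build directory or using a compiler cache such as ccache is the usual approach, though restored build trees are only reusable when the absolute paths stay identical between runs.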

Related

How to not include source files in my conan package?

We are trying to manage our own C++ static libraries using JFrog Artifactory CE. In the near future, these libraries could be accessed by third parties, so we don't want to put any .cpp files in the package; we just want to put .h files and compiled libraries in our Conan packages hosted on Artifactory.
I read through the official Conan guide https://docs.conan.io/en/1.3/creating_packages.html
https://docs.conan.io/en/1.3/creating_packages/package_repo.html
but I cannot find any description of how to exclude source files from the recipe.
If I don't specify exports_sources or exports in my conanfile.py, I cannot build the static libraries; but if I do specify those parameters, Conan automatically puts the source files under export/conan_sources.tgz when I execute conan create.
How can I create a conan package without including source files in the recipe?
There are two different ways to do this instead of using the exports_... functionality:
Use the source() method to retrieve whatever is necessary to fetch the sources: a tarball, a git clone, etc. This might require some authentication, which can be provided through environment variables. It is typical to put that data in conandata.yml and let the source() method read it via self.conan_data; check the docs. The recipes in the conan-center-index repo, which serve to build ConanCenter, use this approach (a sketch follows at the end of this answer).
Use the scm attribute if the recipe lives in the same repo as the source code, to capture the URL and commit of the sources without capturing the sources themselves. If the repository is behind authentication, only authorized devs will be able to see the sources or build from them. Check the SCM section of the docs.
In both cases, if access to the sources is restricted, non-privileged users who try to build packages from source with --build will fail.
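
A minimal sketch of the first approach, assuming a Conan 1.x recipe for a hypothetical mylib package with a conandata.yml sitting next to it:

# conanfile.py -- no exports_sources/exports, so no source files end up in the package
from conans import ConanFile, CMake, tools

class MyLibConan(ConanFile):
    name = "mylib"
    version = "1.0"
    settings = "os", "compiler", "build_type", "arch"

    def source(self):
        # conandata.yml next to this recipe holds, e.g.:
        #   sources:
        #     "1.0":
        #       url: "https://example.com/mylib-1.0.tar.gz"
        #       sha256: "..."
        tools.get(**self.conan_data["sources"][self.version])

    def build(self):
        cmake = CMake(self)
        cmake.configure()
        cmake.build()

    def package(self):
        # ship only headers and compiled libraries
        self.copy("*.h", dst="include", src="include")
        self.copy("*.lib", dst="lib", keep_path=False)
        self.copy("*.a", dst="lib", keep_path=False)

Because the sources are fetched on demand in source() rather than exported, conan create produces a package whose recipe carries no .cpp files.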

Import targets from external CMakeLists.txt file, having already built the external source

I am trying to integrate the Refinitiv Real-Time SDK into my own application.
I have downloaded the source code and built the libraries.
Typically you would then expect there to be an INSTALL target, which would install the libraries and headers into some location, and then, if we're lucky, a find_package module which we can later use to import the library targets into our own project.
Unfortunately, neither of these are provided.
How, then, do I import the libraries and their header files into my project?
ExternalProject_Add
I do not want to use the standard ExternalProject_Add to download and build the source code every time I reconfigure my project. (In particular because our CI server will have to do this for every single build.) Rather I want to build it once (and make it part of the CI server's docker image), and then link against the libraries / include the header files directly from where I've copied the source.
add_library INTERFACE
I know that I can create a new INTERFACE library target:
find_library(LIB_EMA ema ${REFINITIV_BINARY_DIR})
find_library(LIB_ETA eta ${REFINITIV_BINARY_DIR})
# etc.. for all the refinitiv libraries
add_library(refinitiv INTERFACE)
target_link_libraries(refinitiv INTERFACE
${LIB_EMA}
${LIB_ETA}
# etc...
)
target_include_directories(refinitiv INTERFACE
${REFINITIV_SOURCE_DIR}/Ema/Include
${REFINITIV_SOURCE_DIR}/Eta/Include
# etc...
)
This is, however, tedious and prone to breaking whenever Refinitiv releases a new SDK version and decides to change a path or link dependency, etc.
Question:
What I would like to do is use their CMakeLists.txt file, but only to access the already-built targets, not to build them as part of my build.
Is this possible?

Install files using symbolic links with CMake

I have converted my project to use CMake. During development I'd like to be able to install the product not in the normal way, but with (notably) the installed data files being symbolic links into the source tree.
The project is SWI-Prolog, which provides functionality to directly navigate and edit source files. If I use this to extend and fix the system libraries, however, I'm editing the installed copy, which I then need to copy back to the sources before I can commit.
I know I can override functions in CMake, but in this case we are dealing with CMake code that is generated.
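
One possible direction, sketched under the assumption that the relevant install rules can be replaced by an install(CODE) hook that creates a symlink at install time instead of copying (the file names are illustrative, not SWI-Prolog's real layout):

# Development-only switch: install a data file as a symlink into the source tree.
option(INSTALL_AS_SYMLINKS "Symlink installed data files back to the sources" OFF)

if(INSTALL_AS_SYMLINKS)
  # configure-time ${...} expansion bakes the absolute paths into the install script
  install(CODE
    "execute_process(COMMAND ${CMAKE_COMMAND} -E create_symlink
       \"${CMAKE_CURRENT_SOURCE_DIR}/library/lists.pl\"
       \"${CMAKE_INSTALL_PREFIX}/swipl/library/lists.pl\")"
  )
endif()

The hard part the question raises still stands: with generated install rules, a hook like this would have to replace or intercept the generated install() calls rather than sit alongside them.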

FindGLM.cmake not in glm 0.9.7, is it a deprecated way to find libraries in CMake?

So, looking through the newest release of GLM, 0.9.7, I don't see a FindGLM.cmake file anywhere, which used to make it easy to include GLM in CMake. I could always use an old version of it found online, but the following commit had me stumped:
https://github.com/g-truc/glm/commit/62a7daddcf082f754000fc5e42d7bcdf93c895f7
The commit message is "Removed obsolete FindGLM". So, did the developer just dump it, or is there in fact a new way to find libraries in CMake?
Yes, CMake Find modules (FindXyz.cmake files) are deprecated in favour of Package Config files (usually named XyzConfig.cmake). The original philosophy is that Find modules are shipped and maintained by CMake, while Package Config files are shipped and maintained by the package they are intended to find.
CMake's find_package command actually has two modes: Module mode (legacy, using Find modules) and Config mode (preferred, using Package Config files).
For the client consuming the package, little should change (unless more customisation is desired, which is offered by the Config mode of find_package).
Notice that the very commit to which you linked not only drops FindGLM.cmake, but also adds a glmConfig.cmake file.
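
For the consuming side, a short sketch (assuming GLM's installed glmConfig.cmake is on CMake's package search path; the glm::glm imported target is what newer GLM versions export, while older config files may only set GLM_INCLUDE_DIRS):

# Config mode: find_package looks for glmConfig.cmake shipped by GLM itself
find_package(glm CONFIG REQUIRED)

add_executable(myapp main.cpp)
target_link_libraries(myapp PRIVATE glm::glm)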

MSBuild overwriting dependencies

Ok, so I've got a somewhat complicated problem with my build environment that I'm trying to deal with.
I have a solution file that contains multiple C# projects, which is built by a NAnt script calling MSBuild, passing MSBuild the name of the solution file and a path to copy the binaries to. This is because I want my automated build environment (CruiseControl.NET) to create a folder named after the revision of each build; this way I can easily go back to previous binaries for any reason.
So ideally I have a folder layout like this:
c:\build\nightly\rev1
c:\build\nightly\rev2
c:\build\nightly\rev3
...
c:\build\nightly\rev10
etc.
The problem that's arisen is that I recently added the latest version of the Unity IoC container to my project, checking it directly out of MS's online SVN repository. What's happening is I have a Silverlight 3 project that references the Silverlight version of Unity, but I also have other projects (namely my unit testing project) that reference the standard (non-Silverlight) version of Unity.
So, since MSBuild is dumping everything into one single folder, the Silverlight version of the Unity assembly is overwriting the non-Silverlight version, because they have the exact same assembly file name.
Then when CruiseControl runs my unit tests, they fail because they no longer have the proper dependencies available (they try to load the Silverlight-specific Unity assembly, which obviously doesn't work).
So what I want to do is:
keep my desired output directory structure (folder\revision)
not have to manually edit every single proj file I have, as this is error-prone when adding new projects to the solution
Ideally I would like MSBuild to put everything into a folder structure similar to this:
nightly\revision1\project1
nightly\revision1\project2
nightly\revision1\project3
...
nightly\revision2\project1
nightly\revision2\project2
nightly\revision2\project3
etc
I can't modify the Unity project to give it a different file name because it comes from another SVN repository that I cannot commit changes to. I found a similar question posted here, and the suggested solution was to use a "master" MSBuild file with a custom task to extract all the project file names out of the solution and then loop over each one, building them. I tried that, but it doesn't build them in the order of their dependencies, so it fails for my project.
Help?
Firstly I would always have the build server delete the old working copy and check out a fresh copy to avoid any problems with stale artifacts from the previous build.
Next I would have NAnt or MSBuild build the solutions as before, with the artifacts from each build going to their local working output folders.
After that I'd move the artifacts from their working paths to their output paths; this shouldn't require digging through the project files, since you can just tell MSBuild/NAnt to copy working\project1\bin\release\**\*.* to artifacts\project1\.
The script that does this should ideally be stored along with the source, with the main file, e.g. build.nant or build.proj, in the top level of the trunk.
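
A minimal NAnt sketch of that copy step (the target, property, and path names are illustrative):

<!-- build.nant fragment: publish one project's output to the per-revision folder -->
<target name="publish" depends="compile">
  <copy todir="c:\build\nightly\rev${revision}\project1">
    <fileset basedir="working\project1\bin\Release">
      <include name="**/*" />
    </fileset>
  </copy>
</target>

Repeating this per project, or driving it with a loop over the project names, keeps the Silverlight and non-Silverlight Unity assemblies in separate folders, so neither overwrites the other.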
For third-party libraries I would simply include the DLLs directory in your repository. Nothing worse than writing some code and having a third-party dependency break your build because of changes on their end.
Simply document the versions of the libraries you are using, and if you must update them, you'll have a better sense of what breaks the build before you even check it in.
Also, doesn't CC.Net automatically handle providing releases based on revision? I'm using TeamCity and it keeps a copy of the artifacts of every build.
I highly recommend reading JP Boodhoo's Automating Builds with NAnt blog series. That's been my starting point, and I have made lots of changes to suit my own taste. I also highly recommend checking out the builds of many open source projects for examples. I've learned a lot from the builds of the Castle/NHibernate/Rhino-Tools stack.