How to copy a file in Meson to a subdirectory

My application uses a Glade file and also cached data in a JSON file. When I do the following, everything works okay as long as the user installs the application with ninja install
# Install cached JSON file
install_data(
    join_paths('data', 'dataCache.json'),
    install_dir: join_paths('myapp', 'resources')
)
# Install the user interface Glade file
install_data(
    join_paths('src', 'MainWindow.glade'),
    install_dir: join_paths('myapp', 'resources')
)
The downside is that the user has to install the application. I want the user to be able to just build the application with ninja and run it without installing it on their system. The problem is that when I do
# Copy the cached JSON file to the build output directory
configure_file(
    input : join_paths('data', 'dataCache.json'),
    output : join_paths('myapp', 'resources', 'dataCache.json'),
    copy: true
)
# Copy the Glade file to the build output directory
configure_file(
    input : join_paths('src', 'MainWindow.glade'),
    output : join_paths('myapp', 'resources', 'MainWindow.glade'),
    copy: true
)
I get ERROR: Output file name must not contain a subdirectory.
Is there a way to have ninja create the directories myapp/resources in the build folder and then copy the Glade and JSON files there to be used as resources, so that the user can run the application without having to do ninja install?

You can do it by writing a small script and calling it from Meson.
For example, in a file copy.py that takes the relative input and output paths as arguments:
#!/usr/bin/env python3
import os, sys, shutil

# get absolute input and output paths
input_path = os.path.join(
    os.getenv('MESON_SOURCE_ROOT'),
    os.getenv('MESON_SUBDIR'),
    sys.argv[1])
output_path = os.path.join(
    os.getenv('MESON_BUILD_ROOT'),
    os.getenv('MESON_SUBDIR'),
    sys.argv[2])

# make sure the destination directory exists
os.makedirs(os.path.dirname(output_path), exist_ok=True)

# and finally copy the file
shutil.copyfile(input_path, output_path)
and then in your meson.build file:
copy = find_program('copy.py')

run_command(
    copy,
    join_paths('data', 'dataCache.json'),
    join_paths('myapp', 'resources', 'dataCache.json'),
    check: true
)
run_command(
    copy,
    join_paths('src', 'MainWindow.glade'),
    join_paths('myapp', 'resources', 'MainWindow.glade'),
    check: true
)
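Note that run_command() runs at configure time, so the files are copied whenever Meson (re)configures the build directory, before ninja even starts. If you want to sanity-check copy.py outside of Meson, you can invoke it by hand with the three environment variables Meson would otherwise set; this is only a sketch, and the source/build paths below are placeholders for your own directories:
# try_copy.py - illustrative helper, not part of the answer above
import os, subprocess

env = dict(os.environ,
    MESON_SOURCE_ROOT='/path/to/source-dir',  # directory with the top-level meson.build
    MESON_BUILD_ROOT='/path/to/build-dir',    # the configured build directory
    MESON_SUBDIR='')                          # subdir of the meson.build calling the script
subprocess.run(['python3', 'copy.py',
    os.path.join('data', 'dataCache.json'),
    os.path.join('myapp', 'resources', 'dataCache.json')],
    env=env, check=True)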

Related

conan doesn't upload export/conanfile.py to remote

I have created a customized OpenCV package using the Conan package manager and uploaded it to remote storage.
Workflow:
1. Create the package:
   cd c:\path\to\conanfile.py
   conan create . smart/4.26 --profile ue4
2. Export it with conan export . opencv-ue4/3.4.0@smart/4.26
Result:
c:\path\> conan export . opencv-ue4/3.4.0@smart/4.26
[HOOK - attribute_checker.py] pre_export(): WARN: Conanfile doesn't have 'url'. It is recommended to add it as attribute
Exporting package recipe
opencv-ue4/3.4.0@smart/4.26 exports_sources: Copied 3 '.patch' files: cmakes.patch, check_function.patch, typedefs.patch
opencv-ue4/3.4.0@smart/4.26: The stored package has not changed
opencv-ue4/3.4.0@smart/4.26: Exported revision: ceee251590f4bf50c4ff48f6dc27c2ed
I upload everything to the remote:
c:\path> conan upload -r bart opencv-ue4/3.4.0@rs7-smart/4.26 --all
Uploading to remote 'bart':
Uploading opencv-ue4/3.4.0@smart/4.26 to remote 'bart'
Recipe is up to date, upload skipped
Uploading package 1/1: 1d79899922d252aec6da136ce61bff640124c1c4 to 'bart'
Uploaded conan_package.tgz -> opencv-ue4/3.4.0@smart/4.26:1d79 [23667.97k]
Uploaded conaninfo.txt -> opencv-ue4/3.4.0@smart/4.26:1d79 [0.75k]
Uploaded conanmanifest.txt -> opencv-ue4/3.4.0@smart/4.26:1d79 [11.81k]
Our remote storage runs on Artifactory, and I can see in a browser that conanfile.py is not listed anywhere.
I can also verify that the directory C:\Users\user\.conan\data\opencv-ue4\3.4.0\smart\4.26\export on my Windows PC does contain both conanfile.py and conanmanifest.txt.
I am using a Windows PC for all of the above.
Now I'm trying to consume that package on another machine, running Ubuntu Linux.
Here is my conanfile.txt
[requires]
opencv-ue4/3.4.0@smart/4.26
[generators]
json
Command and results
> conan install -g json . opencv-ue4/3.4.0@smart/4.26
Configuration:
[settings]
arch=x86_64
arch_build=x86_64
build_type=Release
compiler=gcc
compiler.libcxx=libstdc++
compiler.version=9
os=Linux
os_build=Linux
[options]
[build_requires]
[env]
opencv-ue4/3.4.0@smart/4.26: Not found in local cache, looking in remotes...
opencv-ue4/3.4.0@smart/4.26: Trying with 'bart'...
Downloading conanmanifest.txt completed [0.33k]
opencv-ue4/3.4.0@smart/4.26: Downloaded recipe revision 0
ERROR: opencv-ue4/3.4.0@smart/4.26: Cannot load recipe.
Error loading conanfile at '/home/user/.conan/data/opencv-ue4/3.4.0/smart/4.26/export/conanfile.py': /home/user/.conan/data/opencv-ue4/3.4.0/smart/4.26/export/conanfile.py not found!
Running ls -la /home/user/.conan/data/opencv-ue4/3.4.0/smart/4.26/export/ shows that the directory indeed contains only the file conanmanifest.txt.
Below is the relevant part of the conanfile.py that I've used to build the package
from conans import ConanFile, CMake, tools


class OpenCVUE4Conan(ConanFile):
    name = "opencv-ue4"
    version = "3.4.0"
    url = ""
    description = "OpenCV custom build for UE4"
    license = "BSD"
    settings = "os", "compiler", "build_type", "arch"
    generators = "cmake"
    exports_sources = 'patches/cmakes.patch', 'patches/check_function.patch', 'patches/typedefs.patch'

    def requirements(self):
        self.requires("ue4util/ue4@adamrehn/profile")
        self.requires("zlib/ue4@adamrehn/{}".format(self.channel))
        self.requires("UElibPNG/ue4@adamrehn/{}".format(self.channel))

    def cmake_flags(self):
        flags = [
            "-DOPENCV_ENABLE_NONFREE=OFF",
            # cut
        ]
        return flags

    def source(self):
        self.run("git clone --depth=1 https://github.com/opencv/opencv.git -b {}".format(self.version))
        self.run("git clone --depth=1 https://github.com/opencv/opencv_contrib.git -b {}".format(self.version))

    def build(self):
        # Patch OpenCV to avoid build errors
        for p in self.exports_sources:
            if p.endswith(".patch"):
                tools.patch(base_path='opencv', patch_file=p, fuzz=True)
        cmake = CMake(self)
        cmake.configure(source_folder="opencv", args=self.cmake_flags())
        cmake.build()
        cmake.install()

    def package_info(self):
        self.cpp_info.libs = tools.collect_libs(self)
The Conan version on both Windows and Linux is 1.54.0.
How do I correctly upload and consume the package?
Update.
After a conversation with @drodri in the comments, I removed conanfile.py from exports_sources, deleted all conan-generated files on all PCs, and removed the uploaded files from the Artifactory.
Then I rebuilt the package, re-exported and re-uploaded it.
The issue was a restriction in our Artifactory: the admins had forbidden uploading .py files.
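For anyone debugging something similar, a quick way to confirm what was actually exported locally before uploading is to list the export folder in the local Conan cache. A minimal sketch, assuming the default Conan 1.x cache layout and the reference used above:
# check_export.py - illustrative only; adjust the reference parts for your package
import os

export_dir = os.path.expanduser(os.path.join(
    '~', '.conan', 'data', 'opencv-ue4', '3.4.0', 'smart', '4.26', 'export'))
# After 'conan export', both conanfile.py and conanmanifest.txt should be listed here
for name in sorted(os.listdir(export_dir)):
    print(name)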

Build executable not found in the path defined in install_dir in meson build file

I am creating a custom target which is an executable built with PyInstaller. I can see the executable in the project directory. However, when I add the install_dir option, I expect to see the executable under the /usr/local directory on Ubuntu, but I do not. Can someone suggest the right way to achieve this?
bindir = get_option('prefix') + '/' + get_option('bindir')
spec_file = files([meson.current_source_dir() + '/pyinst.spec'])

custom_target('pyinstaller',
    output : 'out_exe',
    command : [pyinst, '-F', spec_file, '--clean'],
    install : false,
    install_dir : bindir,
    build_by_default : true
)

What's the use of configure_package_config_file option INSTALL_DESTINATION

For the following CMakeLists.txt file
cmake_minimum_required(VERSION 3.15)
project(Testing)

include(CMakePackageConfigHelpers)
configure_package_config_file(FooConfig.cmake.in
    ${CMAKE_CURRENT_BINARY_DIR}/FooConfig.cmake
    INSTALL_DESTINATION lib/Goo/dmake
)
install(FILES ${CMAKE_CURRENT_BINARY_DIR}/FooConfig.cmake
    DESTINATION lib/Foo/cmake
)
and the FooConfig.cmake.in file
@PACKAGE_INIT@
check_required_components(Foo)
It would finally install FooConfig.cmake to lib/Foo/cmake:
$ cmake ..
$ cmake --build .
$ cmake --install . --prefix ../install/
$ ls -R ../install/
../install/:
lib
../install/lib:
Foo
../install/lib/Foo:
cmake
../install/lib/Foo/cmake:
FooConfig.cmake
It seems that whatever value I set for the INSTALL_DESTINATION option of configure_package_config_file, it doesn't change FooConfig.cmake's installation directory. But if I comment out INSTALL_DESTINATION lib/Goo/dmake, CMake errors out with
No INSTALL_DESTINATION given to CONFIGURE_PACKAGE_CONFIG_FILE()
Then what is the use of the INSTALL_DESTINATION option? What does its value (specifically, the setting lib/Goo/dmake above) actually affect?
The documentation:
The <path> given to INSTALL_DESTINATION must be the destination where the FooConfig.cmake file will be installed to.
I know the right way is INSTALL_DESTINATION lib/Foo/cmake. But even though I deliberately set it to lib/Goo/dmake, the FooConfig.cmake file still ends up at the desired destination lib/Foo/cmake. So why is this option mandatory?
The function configure_package_config_file only generates the file specified as its second argument. It neither installs the generated file nor affects its installation, so its arguments can only affect the content of the generated file.
If you look into the generated script FooConfig.cmake, you will find a line like
get_filename_component(PACKAGE_PREFIX_DIR "${CMAKE_CURRENT_LIST_DIR}/../../../" ABSOLUTE)
This is how the script calculates installation prefix based on the path to the FooConfig.cmake file.
E.g. when you install the package with prefix /usr/local, your FooConfig.cmake script is located in the directory /usr/local/lib/Foo/cmake, so the quoted path expands to /usr/local/lib/Foo/cmake/../../../, which corresponds to /usr/local/.
The number of ../ components in the script equals the number of components in the INSTALL_DESTINATION parameter: lib/Foo/cmake has three components, hence the three ../ above.
Why 'PACKAGE_PREFIX_DIR' is needed
Assume a package installs all public headers into the directory include/ (relative to the installation prefix), and the config script needs to deliver that directory to the user via the variable FOO_INCLUDE_DIR.
The most direct way would be to write the following in the script:
# FooConfig.cmake.in
...
set(FOO_INCLUDE_DIR @CMAKE_INSTALL_PREFIX@/include)
so configure_package_config_file would substitute the real install prefix for @CMAKE_INSTALL_PREFIX@.
But embedding an absolute path into the config file prevents the package from being relocatable: once installed, the package could be used only from the installation directory and could not be copied elsewhere.
For a relocatable package, the config file computes all installation paths relative to the path of the config file itself, because that is the only path known to the script when it is called by find_package. It is similar to how an executable can compute paths to files located next to it.
If the script is installed into lib/Foo/cmake/FooConfig.cmake, then the relative path to the include directory would be ../../../include, so one could use the following assignment in that script:
# FooConfig.cmake.in
...
set(FOO_INCLUDE_DIR ${CMAKE_CURRENT_LIST_DIR}/../../../include)
So when find_package(Foo) executes this script, the variable CMAKE_CURRENT_LIST_DIR expands to the actual directory containing the (possibly relocated) script.
Operating with relative paths requires care from the coder, so CMake provides a way to automate this task:
# CMakeLists.txt
...
# Path to the include directory in the installation tree
set(INCLUDE_DIR "include")
configure_package_config_file(FooConfig.cmake.in
    ${CMAKE_CURRENT_BINARY_DIR}/FooConfig.cmake
    INSTALL_DESTINATION lib/Foo/cmake
    # Tell CMake to handle the variable `INCLUDE_DIR` as a path
    # under the installation tree.
    PATH_VARS INCLUDE_DIR
)
# FooConfig.cmake.in
@PACKAGE_INIT@

# Here we can use `@PACKAGE_INCLUDE_DIR@` as a reference
# to the variable `INCLUDE_DIR` set in CMakeLists.txt.
set_and_check(FOO_INCLUDE_DIR "@PACKAGE_INCLUDE_DIR@")
If you look into the generated file, you will find that @PACKAGE_INCLUDE_DIR@ is expanded into ${PACKAGE_PREFIX_DIR}/include, which uses the variable PACKAGE_PREFIX_DIR.

How to use CMake to create a package that installs to '/etc' and '/var'?

I've got CMake (3.02) installing to a DESTDIR when I invoke:
$ make -C build DESTDIR=$(pwd)/build/rootfs install
This results in a file layout that I'm happy with: binaries are located in the .../build/rootfs/usr/local/bin directory and the init scripts are in .../build/rootfs/etc/init.d. To accomplish that, I used a mixture of relative and absolute paths in my CMakeLists.txt file:
set(CPACK_SET_DESTDIR ON)
...
INCLUDE(CPack)
...
set(ROOTFS_BIN_DIR bin)
set(ROOTFS_ETC_INITD_DIR /etc/init.d)
...
INSTALL(TARGETS myDaemon DESTINATION ${ROOTFS_BIN_DIR})
INSTALL(PROGRAMS myDaemon.sh RENAME myDaemon DESTINATION ${ROOTFS_ETC_INITD_DIR})
With that working (I think), I'm trying to create a simple tarball package, which will eventually become a Debian package (with pre/post install/remove scripts), but when I invoke:
$ make -C build DESTDIR=$(pwd)/build/rootfs package
I'm getting errors because cpack is attempting to write my init scripts to the system's /etc/init.d directory (instead of $(pwd)/build/rootfs/etc/init.d). If I wanted that, a sudo !! would solve the problem. The error (full path replaced with ...):
CMake Error at .../build_src/cmdServer/cmake_install.cmake:44 (file):
file INSTALL cannot copy file
".../src/cmdServer/cmdServer.sh" to
"/etc/init.d/cmdServer".
I'm not using a lot of CPACK directives: in my top level CMakeLists.txt file I have:
SET(CPACK_SET_DESTDIR ON)
SET(CPACK_GENERATOR TGZ)
INCLUDE(CPack)
How can I package my init scripts correctly?
I've been referencing:
https://cmake.org/pipermail/cmake/2008-April/020833.html
and https://cmake.org/pipermail/cmake/2006-November/011890.html

How to include the .so of custom ops in the pip wheel and organize the sources of custom ops?

Following the documentation, I put my_op.cc and my_op.cu.cc under tensorflow/core/user_ops, and created tensorflow/core/user_ops/BUILD which contains
load("//tensorflow:tensorflow.bzl", "tf_custom_op_library")
tf_custom_op_library(
name = "my_op.so",
srcs = ["my_op.cc"],
gpu_srcs = ["my_op.cu.cc"],
)
Then I run the following commands under the root of tensorflow:
bazel build -c opt //tensorflow/core/user_ops:all
bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg
After building and installing the pip wheel, I want to use my_op in the project my_project.
I think I should create something like my_project/tf_op/__init__.py and my_project/tf_op/my_op.py, which calls tf.load_op_library like the example cuda_op.py. However, my_op.so is not included in the installed pip wheel. How can I generate the input (the path of my_op.so) for tf.load_op_library?
Is there any better way to organize my_op.cc, my_op.cu.cc, and my_op.py with my_project?
You can preserve the directory structure of your project and create a setup.py that also includes the .so files. You can add other non-Python files of your project the same way.
Example Directory Structure:
my_package
    my_project
        __init__.py
    setup.py
You can install the 'my_project' package from within the my_package directory using the command:
pip install . --user
(omit the --user argument if you install packages with root access)
from setuptools import setup, find_packages

setup(name='my_project',
      version='1.0',
      description='Project Details',
      packages=find_packages(),
      include_package_data=True,
      package_data={
          '': ['*.so', '*.txt', '*.csv'],
      },
      zip_safe=False)
Don't forget to add __init__.py in all folders containing Python modules you want to import.
Reference: https://docs.python.org/2/distutils/setupscript.html#installing-package-data
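Once the wheel built from that setup.py is installed, loading the op from my_project could look roughly like the sketch below. The module path my_project/tf_op/my_op.py and the attribute name my_op are the hypothetical names from the question, and the actual wrapper attribute exposed by the loaded library depends on the REGISTER_OP name in my_op.cc:
# my_project/tf_op/my_op.py - sketch based on the layout suggested in the question
import os
import tensorflow as tf

# The .so ships as package data, so locate it relative to this file.
_so_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'my_op.so')
_my_op_module = tf.load_op_library(_so_path)

# Re-export the generated Python wrapper for the custom op.
my_op = _my_op_module.my_op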