Using conan to package multiple configurations of preexisting binaries

I have a set of third-party binaries that I am trying to put into a conan package. The binaries are in folders named after the build configuration: Linux32, Win32, and Win64.
I have been able to produce a conan package for the Win64 configuration using the following conanfile.py:
from conans import ConanFile

class LibNameConan(ConanFile):
    name = "LibName"
    version = "1.1.1"
    settings = "os", "compiler", "build_type", "arch"
    description = "Package for LibName"
    url = "None"
    license = "None"

    def package(self):
        self.copy("*", dst="lib", src="lib")
        self.copy("*.h", dst="include", src="include", keep_path=False)

    def package_info(self):
        self.cpp_info.libs = self.collect_libs()
I run the following commands in PowerShell:
conan install
mkdir pkg
cd pkg
conan package .. --build_folder=../
cd ..
conan export name/testing
conan package_files libname/1.1.1@name/testing
For Win64 this works as expected, but when I repeat the steps with the Win32 binaries I do not get a different hash for the package.
I have tried running:
conan install -s arch=x86
However, this still results in the package having the same hash as the x86_64 configuration.
How is the configuration supposed to be set for generating a package from preexisting binaries?

If you are just packaging pre-built binaries, you are fine without the package() method; it is only relevant when building from the recipe:
from conans import ConanFile

class LibNameConan(ConanFile):
    name = "LibName"
    version = "1.1.1"
    settings = "os", "compiler", "build_type", "arch"
    description = "Package for LibName"
    url = "None"
    license = "None"

    def package_info(self):
        self.cpp_info.libs = self.collect_libs()
Unless there is some important reason to package the sources too (do you want consumers to be able to debug the dependency?), you can drop it. If you do package sources, condition it on the build_type, as sketched below.
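A minimal sketch of that idea (hypothetical layout; adjust the src folder to wherever your sources live):

def package(self):
    # Hypothetical: ship the sources only in Debug packages, so consumers can step into them.
    if self.settings.build_type == "Debug":
        self.copy("*.c", dst="src", src="src")
        self.copy("*.h", dst="src", src="src")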
However, this is mostly irrelevant to your question. As your package doesn't have dependencies and you are not using any generator either, you don't need conan install, and the settings you pass there have no effect.
You have to specify the settings for your binary configuration when you run package_files:
$ conan package_files libname/1.1.1@name/testing # using your default config
$ conan package_files libname/1.1.1@name/testing -s arch=x86 # 32 bits instead of 64
...
Probably the recommended way is to use profiles:
$ conan package_files libname/1.1.1@name/testing # using your default profile
$ conan package_files libname/1.1.1@name/testing -pr=myprofile2
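A profile is just a text file; for example, a myprofile2 for the 32-bit configuration could look like this (values are illustrative):

[settings]
os=Windows
arch=x86
compiler=Visual Studio
compiler.version=14
compiler.runtime=MD
build_type=Release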
The documentation recently got a rewrite; you might want to check: https://docs.conan.io/en/latest/creating_packages/existing_binaries.html

Related

conan doesn't upload export/conanfile.py to remote

I have created a customized OpenCV package using the conan package manager and uploaded it to remote storage.
Workflow:
Create the package:
cd c:\path\to\conanfile.py
conan create . smart/4.26 --profile ue4
Export with conan export . opencv-ue4/3.4.0@smart/4.26
Result:
c:\path\> conan export . opencv-ue4/3.4.0@smart/4.26
[HOOK - attribute_checker.py] pre_export(): WARN: Conanfile doesn't have 'url'. It is recommended to add it as attribute
Exporting package recipe
opencv-ue4/3.4.0@smart/4.26 exports_sources: Copied 3 '.patch' files: cmakes.patch, check_function.patch, typedefs.patch
opencv-ue4/3.4.0@smart/4.26: The stored package has not changed
opencv-ue4/3.4.0@smart/4.26: Exported revision: ceee251590f4bf50c4ff48f6dc27c2ed
I upload everything to the remote:
c:\path> conan upload -r bart opencv-ue4/3.4.0@smart/4.26 --all
Uploading to remote 'bart':
Uploading opencv-ue4/3.4.0@smart/4.26 to remote 'bart'
Recipe is up to date, upload skipped
Uploading package 1/1: 1d79899922d252aec6da136ce61bff640124c1c4 to 'bart'
Uploaded conan_package.tgz -> opencv-ue4/3.4.0@smart/4.26:1d79 [23667.97k]
Uploaded conaninfo.txt -> opencv-ue4/3.4.0@smart/4.26:1d79 [0.75k]
Uploaded conanmanifest.txt -> opencv-ue4/3.4.0@smart/4.26:1d79 [11.81k]
Our remote storage runs on Artifactory, and I can see in a browser that conanfile.py is not listed anywhere.
I can also verify that the directory C:\Users\user\.conan\data\opencv-ue4\3.4.0\smart\4.26\export on my Windows PC does contain both conanfile.py and conanmanifest.txt.
I am using a Windows PC for all of the above.
Now I'm trying to consume that package on another machine, running Ubuntu Linux.
Here is my conanfile.txt
[requires]
opencv-ue4/3.4.0@smart/4.26
[generators]
json
Command and results:
> conan install -g json . opencv-ue4/3.4.0@smart/4.26
Configuration:
[settings]
arch=x86_64
arch_build=x86_64
build_type=Release
compiler=gcc
compiler.libcxx=libstdc++
compiler.version=9
os=Linux
os_build=Linux
[options]
[build_requires]
[env]
opencv-ue4/3.4.0@smart/4.26: Not found in local cache, looking in remotes...
opencv-ue4/3.4.0@smart/4.26: Trying with 'bart'...
Downloading conanmanifest.txt completed [0.33k]
opencv-ue4/3.4.0@smart/4.26: Downloaded recipe revision 0
ERROR: opencv-ue4/3.4.0@smart/4.26: Cannot load recipe.
Error loading conanfile at '/home/user/.conan/data/opencv-ue4/3.4.0/smart/4.26/export/conanfile.py': /home/user/.conan/data/opencv-ue4/3.4.0/smart/4.26/export/conanfile.py not found!
Running ls -la /home/user/.conan/data/opencv-ue4/3.4.0/smart/4.26/export/ shows that the directory indeed contains only the file conanmanifest.txt.
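One way to double-check what the remote actually stores is the conan get command (a sketch for Conan 1.x; it prints the recipe by default):

conan get opencv-ue4/3.4.0@smart/4.26 -r bart

If this cannot print conanfile.py, the remote really doesn't have it.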
Below is the relevant part of the conanfile.py that I used to build the package:
from conans import ConanFile, CMake, tools

class OpenCVUE4Conan(ConanFile):
    name = "opencv-ue4"
    version = "3.4.0"
    url = ""
    description = "OpenCV custom build for UE4"
    license = "BSD"
    settings = "os", "compiler", "build_type", "arch"
    generators = "cmake"
    exports_sources = 'patches/cmakes.patch', 'patches/check_function.patch', 'patches/typedefs.patch'

    def requirements(self):
        self.requires("ue4util/ue4@adamrehn/profile")
        self.requires("zlib/ue4@adamrehn/{}".format(self.channel))
        self.requires("UElibPNG/ue4@adamrehn/{}".format(self.channel))

    def cmake_flags(self):
        flags = [
            "-DOPENCV_ENABLE_NONFREE=OFF",
            # cut
        ]
        return flags

    def source(self):
        self.run("git clone --depth=1 https://github.com/opencv/opencv.git -b {}".format(self.version))
        self.run("git clone --depth=1 https://github.com/opencv/opencv_contrib.git -b {}".format(self.version))

    def build(self):
        # Patch OpenCV to avoid build errors
        for p in self.exports_sources:
            if p.endswith(".patch"):
                tools.patch(base_path='opencv', patch_file=p, fuzz=True)
        cmake = CMake(self)
        cmake.configure(source_folder="opencv", args=self.cmake_flags())
        cmake.build()
        cmake.install()

    def package_info(self):
        self.cpp_info.libs = tools.collect_libs(self)
The Conan version on both Windows and Linux is 1.54.0.
How do I correctly upload and consume the package?
Update.
After a conversation with @drodri in the comments, I removed conanfile.py from exports_sources, deleted all conan-generated files on all PCs, and removed the uploaded files from Artifactory.
Then I rebuilt the package, re-exported and re-uploaded it.
The issue was a restriction in our Artifactory: admins had forbidden uploading .py files.

Conan WSL install: option 'pybind' doesn't exist

Update:
I decided to generate the makefiles and build the libraries by hand. That worked. I am leaving this up in case someone has a suggestion for how to get conan to work.
I am trying to install some libraries in WSL Linux using conan.
One of the libraries is here: https://github.com/Aquaveo/xmscore
I have installed conan, cmake, and xmsconan (see Building-Libraries below).
I am installing xmscore using these instructions:
https://github.com/Aquaveo/xmscore/wiki/Building-Libraries
When I run the command:
conan install -pr ../dev/xmsprofile_debug ..
I get the following error message:
Configuration:
[settings]
arch=x86_64
arch_build=x86_64
build_type=Debug
compiler=gcc
compiler.version=9.4
cppstd=17
os=Linux
os_build=Linux
[options]
xmscore:pybind=False
xmscore:xms=True
[build_requires]
*: pybind11/2.10.0
[env]
ERROR: /home/stboerne/Programming/ThirdParty/xmscore/conanfile.py: Error while initializing options. option 'pybind' doesn't exist
Possible options are []
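If I understand the message correctly, the conanfile.py being installed declares no options at all, while the profile tries to set xmscore:pybind and xmscore:xms; a recipe that accepted these values would have to declare something like (a hypothetical sketch, not the actual xmscore recipe):

options = {"pybind": [True, False], "xms": [True, False]}
default_options = {"pybind": False, "xms": True}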
I tried some of the options from here (Cmake: using conan pybind11 package), but nothing seems to work, and I am too much of a novice with conan to figure it out.
The command:
conan search pybind11 -r=all
produces the following output:
Existing package recipes:
Remote 'conancenter':
pybind11/2.4.3
pybind11/2.5.0
pybind11/2.6.0
pybind11/2.6.1
pybind11/2.6.2
pybind11/2.7.0
pybind11/2.7.1
pybind11/2.8.1
pybind11/2.9.1
pybind11/2.9.2
pybind11/2.10.0
Any suggestions?
TIA

How to link against a specific package_id in Conan

I have a package with 2 binaries. The binaries differ only by a single option.
The package is a library. How can I now link against the specific binary of this package that I require?
There are a few options:
You can use cmake_paths and discover the library by name:
First, you add a conanfile.txt, declaring the package name and the cmake_paths generator:
[requires]
my_package/0.1.0@user/channel
[generators]
cmake_paths
Second, you search for the desired library from the package, by its name, in your CMake file:
cmake_minimum_required(VERSION 3.0)
project(myapp)
find_library(MY_LIBRARY foo REQUIRED) # the library name foo in this example
add_executable(myapp app.cpp)
target_link_libraries (myapp ${MY_LIBRARY})
And finally, you pass conan_paths.cmake as the toolchain file to CMake, so it will find your library:
mkdir build && cd build
conan install ..
cmake .. -DCMAKE_TOOLCHAIN_FILE=conan_paths.cmake
cmake --build .
./myapp
It works, but it's a bit fragile: the consumer needs to know the library name, and CMake gives no error when the library is not found (find_library only gained a real REQUIRED option in CMake 3.18), so you have to add a condition to check it, as sketched below.
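A sketch of such a guard, using the MY_LIBRARY variable set by the find_library call above:

if(NOT MY_LIBRARY)
    message(FATAL_ERROR "foo library not found, did you run conan install?")
endif()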
The second possible option is the Components feature, which requires a recipe modification; it lets you provide a different target for each library.
First, you need to update your conanfile.py, adding the components:
from conans import ConanFile, CMake

class MyPackage(ConanFile):
    name = "my_package"
    version = "0.1.0"
    settings = "os", "compiler", "build_type", "arch"
    options = {"shared": [True, False]}
    default_options = {"shared": False}
    generators = "cmake"
    exports_sources = "src/*"

    def build(self):
        cmake = CMake(self)
        cmake.configure(source_folder="src")
        cmake.build()

    def package(self):
        self.copy("*.h", dst="include", src="src")
        self.copy("*.lib", dst="lib", keep_path=False)
        self.copy("*.dll", dst="bin", keep_path=False)
        self.copy("*.dylib*", dst="lib", keep_path=False)
        self.copy("*.so", dst="lib", keep_path=False)
        self.copy("*.a", dst="lib", keep_path=False)

    def package_info(self):
        self.cpp_info.names["cmake_find_package"] = "MyPackage"
        self.cpp_info.names["cmake_find_package_multi"] = "MyPackage"
        self.cpp_info.components["libfoo"].names["cmake_find_package"] = "foo"
        self.cpp_info.components["libfoo"].names["cmake_find_package_multi"] = "foo"
        self.cpp_info.components["libfoo"].libs = ["foo"]
        self.cpp_info.components["libbar"].names["cmake_find_package"] = "bar"
        self.cpp_info.components["libbar"].names["cmake_find_package_multi"] = "bar"
        self.cpp_info.components["libbar"].libs = ["bar"]
As you can see, the recipe builds a package with 2 libraries, foo and bar, and uses a different component for each one. As CamelCase is preferred for CMake targets, the name MyPackage will be used for the file name and the target namespace. The result will be MyPackage::foo and MyPackage::bar.
As a consumer, you have to update your project too.
Now we will use the cmake_find_package generator, to avoid passing CMAKE_TOOLCHAIN_FILE on the command line:
[requires]
my_package/0.1.0@user/channel
[generators]
cmake_find_package
The CMake file now refers to the package by name, and this time it is checked by CMake:
cmake_minimum_required(VERSION 3.0)
project(myapp)
find_package(MyPackage REQUIRED)
add_executable(myapp app.cpp)
target_link_libraries (myapp MyPackage::foo) # We only need libfoo here
And finally, the command line, now simpler:
mkdir build && cd build
conan install ..
cmake ..
cmake --build .
./myapp
Conan will generate FindMyPackage.cmake in build/, which will be loaded by your CMakeLists.txt.
Both demonstrations achieve what you asked, but I prefer the second because it is safer: you link a specific target and avoid mistakes on the consumer side.
NOTE: The Components feature requires Conan >= 1.27.

Creating the symlink fails in a Yocto project

Error log from do_package_qa:
QA Issue: non -dev/-dbg/nativesdk- package contains symlink .so
Source code, CMakeLists.txt:
SET_TARGET_PROPERTIES(${TARGETNAME} PROPERTIES VERSION 1.0 SOVERSION 1)
Do I need to add some parameter like -dev or -dbg?
You need to either manually install those *.so files (and the symlinks to them) in do_install() and package them in FILES_${PN}-dev, or add the following to the recipe:
FILES_SOLIBSDEV = ""
SOLIBS = ".so"

How to include the .so of custom ops in the pip wheel and organize the sources of custom ops?

Following the documentation, I put my_op.cc and my_op.cu.cc under tensorflow/core/user_ops, and created tensorflow/core/user_ops/BUILD, which contains:
load("//tensorflow:tensorflow.bzl", "tf_custom_op_library")
tf_custom_op_library(
name = "my_op.so",
srcs = ["my_op.cc"],
gpu_srcs = ["my_op.cu.cc"],
)
Then I run the following commands under the root of tensorflow:
bazel build -c opt //tensorflow/core/user_ops:all
bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg
After building and installing the pip wheel, I want to use my_op in the project my_project.
I think I should create something like my_project/tf_op/__init__.py and my_project/tf_op/my_op.py, which calls tf.load_op_library like the example cuda_op.py. However, my_op.so is not included in the installed pip wheel. How can I generate the input (the path of my_op.so) for tf.load_op_library?
Is there a better way to organize my_op.cc, my_op.cu.cc, and my_op.py within my_project?
You can preserve the directory structure of your project and create a setup.py that also includes the .so files. You can add other non-Python files of your project the same way.
Example directory structure:
my_package/
    my_project/
        __init__.py
    setup.py
You can install the 'my_project' package while in the my_package directory using the command:
pip install . --user
(Avoid the --user argument if you install packages with root access.) An example setup.py:
from setuptools import setup, find_packages

setup(name='my_project',
      version='1.0',
      description='Project Details',
      packages=find_packages(),
      include_package_data=True,
      package_data={
          '': ['*.so', '*.txt', '*.csv'],
      },
      zip_safe=False)
Don't forget to add __init__.py in all folders containing Python modules you want to import.
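For the tf.load_op_library part of the question: once the .so is shipped inside the package via package_data, its path can be derived from the installed module's location (a sketch, assuming my_op.so ends up next to my_project/__init__.py):

import os
import tensorflow as tf

# Resolve my_op.so relative to this installed module, then load the custom op library.
_this_dir = os.path.dirname(os.path.abspath(__file__))
my_op_module = tf.load_op_library(os.path.join(_this_dir, "my_op.so"))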
Reference: https://docs.python.org/2/distutils/setupscript.html#installing-package-data