How to properly use bin_package in Yocto - embedded

Good day,
I am trying to unpack the files from a .tar.gz archive into my BitBake-generated image. Basically, I just want to copy some files from the archive to usr/lib/fonts.
The file structure looks like this:
├── deploy-executable
│   └── usr
│       └── lib
│           └── fonts
│               ├── LiberationMono-BoldItalic.ttf
│               ├── LiberationMono-Bold.ttf
│               ├── LiberationMono-Italic.ttf
│               ├── LiberationMono-Regular.ttf
│               ├── LiberationSans-BoldItalic.ttf
....
This goes inside an archive called deploy-executable-0.1.tar.gz
Now my deploy-executable_0.1.bb file looks like this:
SUMMARY = "Recipe for populating with bin_package"
DESCRIPTION = "This recipe uses bin_package to add some demo files to an image"
LICENSE = "CLOSED"
SRC_URI = "file://${BP}.tar.gz"
inherit bin_package
(I have followed the instructions from this post: https://www.yoctoproject.org/pipermail/yocto/2015-December/027681.html)
The problem is that I keep getting the following error:
ERROR: deploy-executable-0.1-r0 do_install: bin_package has nothing to install. Be sure the SRC_URI unpacks into S.
Can anyone help me?
Let me know if you need more information. I will be happy to provide.

Solution:
Add a subdir parameter after the file path of your tarball (and leave ${S} alone) to get it to unpack to the right location.
E.g.:
SRC_URI = "file://${BP}.tar.gz;subdir=${BP}"
Explanation:
According to the BitBake docs:
subdir : Places the file (or extracts its contents) into the specified subdirectory. This option is useful for unusual tarballs or other archives that do not have their files already in a subdirectory within the archive.
So when your tarball gets extracted and unpacked, you can specify that it should go into ${BP} (relative to ${WORKDIR}), which is what do_package & co. expect.
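Putting it together, the full recipe would look something like this (a minimal sketch based on the files shown in the question; the do_install generated by bin_package copies whatever it finds in ${S} into the package):
SUMMARY = "Recipe for populating with bin_package"
DESCRIPTION = "This recipe uses bin_package to add some demo files to an image"
LICENSE = "CLOSED"

SRC_URI = "file://${BP}.tar.gz;subdir=${BP}"

inherit bin_package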
Note that this is also called out in the bin_package.bbclass recipe class file itself (though for a slightly different application):
# Note:
# The "subdir" parameter in the SRC_URI is useful when the input package
# is rpm, ipk, deb and so on, for example:
#
# SRC_URI = "http://example.com/foo-1.0-r1.i586.rpm;subdir=foo-1.0"
#
# Then the files would be unpacked to ${WORKDIR}/foo-1.0, otherwise
# they would be in ${WORKDIR}.
I ran into issues simply setting S = "${WORKDIR}" because I had some leftover artifacts in my working directory from before I made the recipe a bin_package. The leftover sysroot_* artifacts wreaked havoc on do_package_shlibs... I figured it was better to just unpack the archive where it was expected to go instead of mucking with ${S}, for a bit of robustness.

Related

ESP-IDF project with multiple source files

I started my project with a simple "blink" example and used it as a template to write my code.
This example used only one source file blink.c.
Eventually, I want to use multiple source files in the project, but I can't figure out how to configure CMakeLists.txt to compile them.
My CMakeLists.txt is:
cmake_minimum_required(VERSION 3.5)
include($ENV{IDF_PATH}/tools/cmake/project.cmake)
project(blink)
I want to add, for example, init.c.
I tried different ways, but with no success.
Neither idf_component_register() nor register_component() worked for me.
Any idea how to correctly configure the project?
Right, the CMake project hierarchy in ESP-IDF is a bit tricky. You are looking at the wrong CMakeLists.txt file. Instead of the one in the root directory, open the one in blink/main/CMakeLists.txt. This file lists the source files for the "main" component, which is the one you want to use. It would look like this:
idf_component_register(SRCS "blink.c" "init.c"
                       INCLUDE_DIRS ".")
Make sure your init.c file is in the same directory as this CMakeLists.txt and blink.c.
I also recommend taking a look at the Espressif Build System documentation, it's quite useful.
You should edit the CMakeLists.txt located in the main folder inside your project folder. In addition, you need to put the directory that contains the header files into the INCLUDE_DIRS parameter.
For example, if you have the file structure below in your project (with init.h inside an include folder):
blink/
├── main/
│   ├── include/
│   │   └── init.h
│   ├── blink.c
│   ├── CMakeLists.txt
│   ├── init.c
│   └── ...
├── CMakeLists.txt
└── ...
The content in your main/CMakeLists.txt should be:
idf_component_register(SRCS "blink.c" "init.c"
                       INCLUDE_DIRS "." "include")
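With that in place, both source files are compiled into the main component and init.h can be included from blink.c. A minimal sketch of the hypothetical init files (the function name my_init is illustrative, not from the question; app_main is the standard ESP-IDF entry point):
/* main/include/init.h */
#ifndef INIT_H
#define INIT_H
void my_init(void);
#endif

/* main/init.c */
#include "init.h"
void my_init(void) {
    /* one-time setup code would go here */
}

/* main/blink.c */
#include "init.h"
void app_main(void) {
    my_init();
    /* ... blink loop ... */
}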

Use module from parent directory in rust

Is it possible to structure a rust project in this way?
Directory structure:
src
├── a
│   └── bin1.rs
├── b
│   ├── bin2.rs
└── common
    ├── mod.rs
from Cargo.toml:
[[bin]]
name = "bin1"
path = "src/a/bin1.rs"
[[bin]]
name = "bin2"
path = "src/b/bin2.rs"
I would like to be able to use the common module in bin1.rs and bin2.rs. It's possible by adding the path attribute before the import:
#[path="../common/mod.rs"]
mod code;
Is there a way for bin1.rs and bin2.rs to use common without having to hardcode the path?
The recommended method to share code between binaries is to have a src/lib.rs file. Both binaries automatically have access to anything accessible through this lib.rs file as a separate crate.
Then you would simply define a mod common; in the src/lib.rs file. If your crate is called my_crate, your binaries would be able to use it with
use my_crate::common::Foo;
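A minimal sketch of that layout (my_crate and Foo are placeholder names; the package in Cargo.toml would have to be named my_crate for the use path to match):
// src/lib.rs
pub mod common;                 // picks up src/common/mod.rs

// src/common/mod.rs
pub struct Foo;

// src/a/bin1.rs
use my_crate::common::Foo;      // the library crate is addressed by its package name

fn main() {
    let _foo = Foo;
}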

Single CMakeLists.txt enough for my project?

I am trying to port my old CMake setup to modern CMake (CMake 3.0.2 or above). In the old design I had multiple CMakeLists.txt files; each directory contained one.
My current project's directory structure looks like :
.
├── VizSim.cpp
├── algo
├── contacts
│   ├── BoundingVolumeHierarchies
│   │   └── AABBTree.h
│   └── SpatialPartitoning
├── geom
│   └── Geometry.h
├── math
│   ├── Tolerance.h
│   ├── Vector3.cpp
│   └── Vector3.h
├── mesh
│   ├── Edge.h
│   ├── Face.h
│   ├── Mesh.cpp
│   ├── Mesh.h
│   └── Node.h
├── util
│   ├── Defines.h
│   └── Math.h
└── viz
    └── Renderer.h
What I was planning to do was just use a single CMakeLists.txt, place all the cpp files in SOURCE and all the headers in HEADER, and use add_executable.
set (SOURCE
    ${SOURCE}
    ${CMAKE_CURRENT_SOURCE_DIR}/src/mesh/Mesh.cpp
    ${CMAKE_CURRENT_SOURCE_DIR}/src/math/Vector3.cpp
    ${CMAKE_CURRENT_SOURCE_DIR}/src/VizSim.cpp
    ....
)
set (HEADER
    ${HEADER}
    ${CMAKE_CURRENT_SOURCE_DIR}/src/mesh/Mesh.h
    ${CMAKE_CURRENT_SOURCE_DIR}/src/math/Vector3.h
    ....
)
add_library(${PROJECT_NAME} SHARED ${SOURCE})
Doing this, I am worried whether using a single CMakeLists.txt is good practice. So does a single CMakeLists.txt suffice, or do I need a CMakeLists.txt for each folder?
I can only think of one good reason to have multiple CMakeLists.txt files in my project, and that is modularity, considering my project will eventually grow.
This is a bit long for a comment – so I make it an answer:
In one of my projects (a library), I have so many sources that I started to move some of them into a sub-directory util.
For this, I made separate variables:
file(GLOB headers *.h)
file(GLOB sources *.cc)
file(GLOB utilHeaders
    RELATIVE ${CMAKE_CURRENT_SOURCE_DIR}
    ${CMAKE_CURRENT_SOURCE_DIR}/util/*.h)
file(GLOB utilSources
    RELATIVE ${CMAKE_CURRENT_SOURCE_DIR}
    ${CMAKE_CURRENT_SOURCE_DIR}/util/*.cc)
To make it nicer looking / more convenient in Visual Studio, I inserted source_group commands, which generate appropriate sub-folders in the VS project. I believe they are called "filters".
source_group("Header Files\\Utilities" FILES ${utilHeaders})
source_group("Source Files\\Utilities" FILES ${utilSources})
Of course, the utilHeaders and utilSources variables have to be included as well where the sources are provided to the target:
add_library(libName
    ${sources} ${headers}
    ${utilSources} ${utilHeaders})
That's it.
Fred reminded me in his comment that I shouldn't forget to mention that file(GLOB) has a certain weakness (although I find it very valuable in our daily work). This is even mentioned in the CMake documentation:
Note: We do not recommend using GLOB to collect a list of source files from your source tree. If no CMakeLists.txt file changes when a source is added or removed then the generated build system cannot know when to ask CMake to regenerate. The CONFIGURE_DEPENDS flag may not work reliably on all generators, or if a new generator is added in the future that cannot support it, projects using it will be stuck. Even if CONFIGURE_DEPENDS works reliably, there is still a cost to perform the check on every rebuild.
So, when using file(GLOB), you should never forget to re-run CMake once files have been added, moved, or removed. An alternative could be to add, move, or remove the files directly in the generated build scripts (e.g. VS project files) and rely on the fact that the next re-run of CMake will cover those files as well. Last but not least, a git pull is another case after which it is worth re-running CMake.
I would always recommend a CMakeLists.txt file per directory (a minimal sketch of such a layout follows the list). My reasons:
locality: keep everything that belongs together in the same folder. This includes the relevant parts of the build system. I would hate having to navigate to the root folder to see how a library or target is set up.
separation of build artifacts and related build code: Tests belong below test, libraries below lib, binaries below bin, documentation below doc, and utilities below utils. This may vary from project to project. When I have to make a change to the documentation, why should I wade through dozens of unrelated CMake code? Just have a look into the right CMakeLists.txt.
avoid handling of paths: In most cases relative or absolute paths including stuff like ${CMAKE_CURRENT_SOURCE_DIR} can be avoided. That leads to maintainable build code and reduces errors from wrong paths. Especially with out-of-source build, which should be used anyway.
localization of errors: If a CMake error occurs it is easier to locate the problem. Often a sub-directory can be excluded as a first workaround.
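A minimal sketch of what the per-directory layout could look like for the project above (target and directory names are taken from the question's tree; adapt the paths if your sources actually live under src/):
# CMakeLists.txt (top level)
cmake_minimum_required(VERSION 3.0.2)
project(VizSim CXX)

add_subdirectory(math)
add_subdirectory(mesh)

add_executable(VizSim VizSim.cpp)
target_link_libraries(VizSim PRIVATE mesh math)

# math/CMakeLists.txt
add_library(math Vector3.cpp Vector3.h Tolerance.h)
target_include_directories(math PUBLIC ${CMAKE_CURRENT_SOURCE_DIR})

# mesh/CMakeLists.txt
add_library(mesh Mesh.cpp Mesh.h Edge.h Face.h Node.h)
target_include_directories(mesh PUBLIC ${CMAKE_CURRENT_SOURCE_DIR})
target_link_libraries(mesh PUBLIC math)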

How do I get log4j.properties to read environment variables?

I have multiple environments (dev/qa/prod) for my application. I would therefore like to differentiate the log conversion pattern based on the environment. I have an env variable set which stores which environment the application is running in. But how do I get log4j.properties to read this env variable?
This is my what my current properties file looks like:
log4j.rootLogger = INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern= [%d{yyyy-MM-dd HH:mm:ss}] my-api.%-5p: %m%n
I have tried following the log4j lookup docs, but this still does not include the environment in my log file.
log4j.appender.stdout.layout.ConversionPattern= [%d{yyyy-MM-dd HH:mm:ss}] ${env:ENVIRONMENT}-my-api.%-5p: %m%n
The output looks like this:
[2018-01-22 14:17:20] -my-api.INFO : some-message.
But I want it to look like this:
[2018-01-22 14:17:20] dev-my-api.INFO : some-message.
You may also try a pattern that has become some sort of standard in Luminus and other frameworks. You create an env directory that holds prod/dev/test subfolders with some additional code and resources. In your lein project, for each profile you specify where to find those files in addition to the default path.
As a result, you get three different log settings. Each of them is loaded depending on what you are doing: when just developing the code, from env/dev/resources/log4j.properties, and when running tests, from env/test/resources/log4j.properties.
Here is an example:
$ tree env
.
├── dev
│   └── resources
│       └── log4j.properties
├── prod
│   └── resources
│       └── log4j.properties
└── test
    └── resources
        └── log4j.properties
Some bits from the project.clj:
:profiles {:dev {:plugins [[autodoc/lein-autodoc "1.1.1"]]
                 :dependencies [[org.clojure/clojure "1.8.0"]
                                [log4j/log4j "1.2.17"]]
                 :resource-paths ["env/dev/resources"]}}
For the test profile, you may want to specify both the dev and test paths.
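With that layout, each environment's file can simply hard-code its own prefix; env/dev/resources/log4j.properties, for example, might look like this (a sketch reusing the pattern from the question):
log4j.rootLogger = INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern= [%d{yyyy-MM-dd HH:mm:ss}] dev-my-api.%-5p: %m%n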

IntelliJ IDEA not picking up correct application-{}.properties file

I have a Spring Boot 1.5.1 project that uses profile-specific properties files. All of my properties files are in /src/main/resources.
When using IntelliJ 2016.3.4, I set Run Configuration | Active Profile to "local" and run it. I see this in the console:
The following profiles are active: local
But there is a value in the property file
data.count.users=2
which is used as:
@Value("${data.count.users}")
private int userCount;
that is not being picked up and thus causing the error:
Caused by: java.lang.IllegalArgumentException: Could not resolve placeholder 'data.count.users' in string value "${data.count.users}"
However, if I run this via Gradle
bootRun {
    systemProperty 'spring.profiles.active', System.properties['spring.profiles.active']
}
as
gradle bootRun -Dspring.profiles.active=local
then everything starts up using the local profile as expected. Can anyone see why this is not being properly picked up? In IntelliJ Project Structure I have my /src/main/resources defined as my Resource Folders.
UPDATE:
Adding screenshot of Configuration:
I could be wrong here but it doesn't look like the spring.profiles.active environment variable is actually set in your configuration, regardless of what you've selected as your Active Profile. This may be a bug with IntelliJ.
However, setting the environment variable in Run -> Edit Configurations definitely works for me.
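For example, with the profile named local as in the question, either of these entries in the run configuration should work (the first as an environment variable, the second as a VM option):
SPRING_PROFILES_ACTIVE=local
-Dspring.profiles.active=local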
Please add the Spring facet to your Spring Boot module to get full support.
Is the classpath of module heimdall the correct one, i.e. does it contain the resources folder shown with your application.properties?
If this doesn't help, please file a minimal sample project reproducing the exact structure of your project in our bug tracker (https://youtrack.jetbrains.com/issues/IDEA); there are too many variables to investigate otherwise.
Using -Dspring.config.location in VM options in IntelliJ helped me.
-Dspring.config.location=file:/C:/Users/<project path>/src/main/resources/application-dev.properties
This could also be due to a non-standard configuration setup, for instance:
src/main/resources
├── application.properties
├── config1
│   ├── application-dev.properties
│   ├── application-prod.properties
│   ├── application.properties
│   └── logback-spring.xml
├── config2
│   ├── application-dev.properties
│   ├── application-prod.properties
│   ├── application.properties
│   └── logback-spring.xml
└── config3
    ├── application-dev.properties
    ├── application-prod.properties
    ├── application.properties
    └── logback-spring.xml
This can be solved by passing the parameters logging.config and spring.config.name for Logback and Spring respectively. For the above example:
java -jar \
-Dspring.profiles.active=dev \
-Dlogging.config=classpath:config1/logback-spring.xml \
-Dspring.config.name=application,config1/application \
target/my-application.0.0.1.jar
Here the root application.properties is used, overridden by config1/application.properties, which in turn is overridden by config1/application-dev.properties. The parameters (JVM system properties) can be specified in IDEA's run configuration under VM options.
As far as advanced IDE support (highlighting, completion etc.) is concerned, there is an open issue for complex/custom configuration setups: IDEA-180498