How to publish models vs. full project artifacts separately? - ivy

How could ivy support publishing artifacts of projects in multiple phases?
Suppose we had projects A and B. A depends on B's models while B depends on A's models. (Usually the circular dependence isn't that direct, but the example serves; our projects are relatively loosely coupled, sending messages to each other via the models.) The models themselves don't depend on anything, so I can easily build those artifacts. However, while I can build moduleA-models.jar, I cannot build moduleA.jar until I get moduleB-models.jar. (And, of course, vice versa with module B.)
So I'm thinking of a two-phase publishing effort, and in fact I'm doing exactly that. I have an Ant target that builds the models and then publishes the 'model' Ivy conf. I run through all the projects building/publishing the models, then go back and start building the rest of the project code. Note that 'going back and building the rest of the project code' implies a new publish call, this time with all the artifacts, not just the model artifact.
However, Ivy is mildly unhappy with this. For example, it sometimes sees module A's published ivy.xml with just the model jar, and then might find out later there's an updated ivy.xml for module A that has model and non-model jars in it. By and large I can get around that with the changing="true" dependency flag.
However, lately even that fails for me: Ivy tries to build projects out of order and the build breaks. I also occasionally get into trouble with a missing version of a project (again because Ivy sees two different versions of a project's ivy.xml within the same build cycle).
So what's the recommended approach here? Separate ivy projects (in the same file structure) perhaps?

Why don't you structure your project to have a common module that builds and publishes the jars containing the message model classes?
├── build.xml
├── common
│   ├── build.xml
│   ├── ivy.xml
│   └── src
│       └── ...
├── module1
│   ├── build.xml
│   ├── ivy.xml
│   └── src
│       └── ...
└── module2
    ├── build.xml
    ├── ivy.xml
    └── src
        └── ...
Each module can then declare a dependency on the common model artifacts:
<dependency org="myproj.common" name="module1-model" rev="1.0"/>
The root build file can use the buildlist task to determine the module build order based on the ivy file dependencies.
<target name="determine-build-order">
    <ivy:buildlist reference="build-path">
        <fileset dir="." includes="modules/**/build.xml"/>
    </ivy:buildlist>
</target>

<target name="build" depends="determine-build-order">
    <subant target="build" buildpathref="build-path"/>
</target>
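For reference, a sketch of what such a common model module's ivy.xml might declare (organisation, module, and artifact names are illustrative, chosen to match the dependency snippet above):

<ivy-module version="2.0">
    <info organisation="myproj.common" module="module1-model" revision="1.0"/>
    <publications>
        <artifact name="module1-model" type="jar" ext="jar"/>
    </publications>
</ivy-module>

Because this module publishes only the model jar, the circular dependency between the full projects disappears and a single publish step per module suffices.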

Related

Gradle setup for Kotlin multi-module project structure

I have a Gradle project setup which looks similar to the one graphed below, and I have some trouble setting up the dependencies between the different modules.
Top Kotlin Project
├── Project1
│   ├── ModuleA1 (uses C1)
│   ├── ModuleA2
│   ├── ModuleA3
│   └── settings.gradle
├── ModuleB1 (uses C1)
├── ModuleC1 (library project)
└── settings.gradle
Modules A1, A2, and A3 (in Project1) and Module B1 are basically isolated Gradle applications, each with its own build.gradle.kts.
Module C1 is supposed to be a library module with shared code that is used by Module A1 and Module B1.
Originally I thought I only had to declare C1 as an implementation dependency in the build.gradle.kts files of A1 and B1, like this:
implementation(project(":ModuleC1"))
and include it in the settings.gradle of the Top Kotlin Project and also of Project1:
include(":ModuleC1")
project(":ModuleC1").projectDir = File("../modulec1") // this line is only needed in the settings.gradle of Project1
However, this does not work, for multiple reasons.
First of all, all other projects/modules need to include all the repositories ModuleC1 needs, which means that whenever ModuleC1 changes I might also need to change the build files of other modules.
Most importantly though, all modules need the Kotlin Gradle plugin to build.
However, if ModuleC1 includes the plugin and ModuleA1 does as well, I get the error
> Plugin request for plugin already on the classpath must not include a version
If I do not include the plugin in ModuleC1, building ModuleA1 works, but ModuleC1 cannot be built on its own anymore.
The build.gradle.kts of ModuleC1 currently looks like this:
plugins {
    kotlin("jvm") version "1.6.20"
    `java-library`
}

group = "org.test.project"
version = "0.1"

repositories {
    mavenCentral()
}

dependencies {
    testImplementation("org.junit.jupiter:junit-jupiter-api:5.8.1")
    testRuntimeOnly("org.junit.jupiter:junit-jupiter-engine:5.8.1")
}

tasks.getByName<Test>("test") {
    useJUnitPlatform()
}
In conclusion, I think this setup of simply including ModuleC1 in the other projects/modules is pretty flawed, and there must be a better way to do this.
In summary, my goal is to be able to develop ModuleC1 independently but use it in the other modules as if it were just another package there.
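For what it's worth, one common convention that avoids the version-on-classpath error (a sketch, assuming all modules are included in one root build rather than separate composite builds): declare the plugin version once in the root build.gradle.kts with apply false, and apply the plugin without a version in each module.

// root build.gradle.kts: put the Kotlin plugin on the classpath once, without applying it
plugins {
    kotlin("jvm") version "1.6.20" apply false
}

// ModuleC1/build.gradle.kts (and likewise ModuleA1, ModuleB1): apply it without a version
plugins {
    kotlin("jvm")
    `java-library`
}

With this layout the version lives in exactly one place, and ModuleC1 can still be built by itself from the root with gradle :ModuleC1:build.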

How to properly use bin_package in Yocto

Good day,
I am trying to unpack the files from a .tar.gz archive into my BitBake-generated image. Basically, I just want to copy some files from the archive to /usr/lib/fonts.
File structure is like so:
├── deploy-executable
│   └── usr
│       └── lib
│           └── fonts
│               ├── LiberationMono-BoldItalic.ttf
│               ├── LiberationMono-Bold.ttf
│               ├── LiberationMono-Italic.ttf
│               ├── LiberationMono-Regular.ttf
│               ├── LiberationSans-BoldItalic.ttf
│               └── ...
This goes inside an archive called deploy-executable-0.1.tar.gz
Now my deploy-executable_0.1.bb file looks like this:
SUMMARY = "Recipe for populating with bin_package"
DESCRIPTION = "This recipe uses bin_package to add some demo files to an image"
LICENSE = "CLOSED"
SRC_URI = "file://${BP}.tar.gz"
inherit bin_package
(I have followed the instructions from this post: https://www.yoctoproject.org/pipermail/yocto/2015-December/027681.html)
The problem is that I keep getting the following error:
ERROR: deploy-executable-0.1-r0 do_install: bin_package has nothing to install. Be sure the SRC_URI unpacks into S.
Can anyone help me?
Let me know if you need more information. I will be happy to provide.
Solution:
Add a subdir parameter after the file path of your tarball in SRC_URI (and leave ${S} alone) to get it to unpack to the right location.
E.g.:
SRC_URI = "file://${BP}.tar.gz;subdir=${BP}"
Explanation:
According to bitbake docs
subdir : Places the file (or extracts its contents) into the specified subdirectory. This option is useful for unusual tarballs or other archives that do not have their files already in a subdirectory within the archive.
So when your tarball gets extracted and unpacked, you can specify that it should go into ${BP} (relative to ${WORKDIR}), which is what do_package and friends expect.
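Putting it together, the corrected recipe from the question would look like this (unchanged except for the SRC_URI line):

SUMMARY = "Recipe for populating with bin_package"
DESCRIPTION = "This recipe uses bin_package to add some demo files to an image"
LICENSE = "CLOSED"

# Unpack the tarball into ${WORKDIR}/${BP}, i.e. the default ${S}, so bin_package finds it
SRC_URI = "file://${BP}.tar.gz;subdir=${BP}"

inherit bin_package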
Note that this is also called out in the bin_package.bbclass recipe class file itself (though for a slightly different application):
# Note:
# The "subdir" parameter in the SRC_URI is useful when the input package
# is rpm, ipk, deb and so on, for example:
#
# SRC_URI = "http://example.com/foo-1.0-r1.i586.rpm;subdir=foo-1.0"
#
# Then the files would be unpacked to ${WORKDIR}/foo-1.0, otherwise
# they would be in ${WORKDIR}.
I ran into issues simply setting S = "${WORKDIR}", because I had some leftover artifacts in my working directory from a recipe from before I made it a bin_package. The leftover sysroot_* artifacts wreaked havoc on do_package_shlibs... I figured it was better to just unpack the archive where it was expected to go instead of mucking with changing ${S}, for a bit of robustness.

Single CMakeLists.txt enough for my project?

I am trying to port my old CMake to modern CMake (CMake 3.0.2 or above). In the old design I had multiple CMakeLists.txt files; each directory contained its own CMakeLists.txt.
My current project's directory structure looks like :
.
├── VizSim.cpp
├── algo
├── contacts
│   ├── BoundingVolumeHierarchies
│   │   └── AABBTree.h
│   └── SpatialPartitoning
├── geom
│   └── Geometry.h
├── math
│   ├── Tolerance.h
│   ├── Vector3.cpp
│   └── Vector3.h
├── mesh
│   ├── Edge.h
│   ├── Face.h
│   ├── Mesh.cpp
│   ├── Mesh.h
│   └── Node.h
├── util
│   ├── Defines.h
│   └── Math.h
└── viz
    └── Renderer.h
What I was planning to do was just use a single CMakeLists.txt, place all the cpp files in SOURCE and all the headers in HEADER, and use add_executable.
set(SOURCE
    ${SOURCE}
    ${CMAKE_CURRENT_SOURCE_DIR}/src/mesh/Mesh.cpp
    ${CMAKE_CURRENT_SOURCE_DIR}/src/math/Vector3.cpp
    ${CMAKE_CURRENT_SOURCE_DIR}/src/VizSim.cpp
    ....
)
set(HEADER
    ${HEADER}
    ${CMAKE_CURRENT_SOURCE_DIR}/src/mesh/Mesh.h
    ${CMAKE_CURRENT_SOURCE_DIR}/src/math/Vector3.h
    ....
)
add_library(${PROJECT_NAME} SHARED ${SOURCE})
Doing this, I am worried whether using a single CMakeLists.txt is good practice. So does a single CMakeLists.txt suffice, or do I need a CMakeLists.txt for each folder?
I can only think of one good reason to have multiple CMakeLists.txt files in my project, and that is modularity, considering my project will eventually grow.
This is a bit long for a comment, so I'll make it an answer:
In one of my projects (a library), I have so many sources that I started to move some of them into a sub-directory util.
For this, I made separate variables:
file(GLOB headers *.h)
file(GLOB sources *.cc)
file(GLOB utilHeaders
    RELATIVE ${CMAKE_CURRENT_SOURCE_DIR}
    ${CMAKE_CURRENT_SOURCE_DIR}/util/*.h)
file(GLOB utilSources
    RELATIVE ${CMAKE_CURRENT_SOURCE_DIR}
    ${CMAKE_CURRENT_SOURCE_DIR}/util/*.cc)
To make it nicer looking / more convenient in Visual Studio, I inserted source_group commands, which generate appropriate sub-folders in the VS project. I believe they are called "Filters".
source_group("Header Files\\Utilities" FILES ${utilHeaders})
source_group("Source Files\\Utilities" FILES ${utilSources})
Of course, I have to include the variables utilHeaders and utilSources as well where the sources are provided:
add_library(libName
    ${sources} ${headers}
    ${utilSources} ${utilHeaders})
That's it.
Fred reminded me in his comment that I shouldn't forget to mention that file(GLOB ...) has a certain weakness (although I find it very valuable in our daily work). This is even mentioned in the CMake docs:
Note: We do not recommend using GLOB to collect a list of source files from your source tree. If no CMakeLists.txt file changes when a source is added or removed then the generated build system cannot know when to ask CMake to regenerate. The CONFIGURE_DEPENDS flag may not work reliably on all generators, or if a new generator is added in the future that cannot support it, projects using it will be stuck. Even if CONFIGURE_DEPENDS works reliably, there is still a cost to perform the check on every rebuild.
So, when using file(GLOB ...), you should never forget to re-run CMake once files have been added, moved, or removed. An alternative could be to add, move, or remove the files directly in the generated build scripts (e.g. VS project files) and rely on the fact that the next re-run of CMake will cover those files as well. Last but not least, a git pull is something else after which it's worth re-running CMake.
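For completeness, newer CMake (3.12+) offers the CONFIGURE_DEPENDS flag mentioned in the quoted note; a sketch, with the reliability caveats quoted above still applying:

# Ask the build system to re-check this glob on every build (CMake >= 3.12)
file(GLOB utilSources CONFIGURE_DEPENDS ${CMAKE_CURRENT_SOURCE_DIR}/util/*.cc)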
I would always recommend a CMakeLists.txt file per directory. My reasons:
locality: keep everything in the same folder that belongs together. This includes the relevant parts of the build system. I would hate it to navigate to the root folder to see how a library or target was invoked.
separation of build artifacts and related build code: Tests belong below test, libraries below lib, binaries below bin, documentation below doc, and utilities below utils. This may vary from project to project. When I have to make a change to the documentation, why should I wade through dozens of unrelated CMake code? Just have a look into the right CMakeLists.txt.
avoid handling of paths: In most cases relative or absolute paths including stuff like ${CMAKE_CURRENT_SOURCE_DIR} can be avoided. That leads to maintainable build code and reduces errors from wrong paths. Especially with out-of-source build, which should be used anyway.
localization of errors: If a CMake error occurs it is easier to locate the problem. Often a sub-directory can be excluded as a first workaround.
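To illustrate this per-directory style with the question's layout, here is a minimal sketch (target names and the abbreviated file lists are my assumptions, not the answerer's actual code):

# ./CMakeLists.txt
cmake_minimum_required(VERSION 3.0.2)
project(VizSim)
add_subdirectory(math)
add_subdirectory(mesh)
add_executable(vizsim VizSim.cpp)
target_link_libraries(vizsim mesh)

# ./math/CMakeLists.txt
add_library(math Vector3.cpp Vector3.h Tolerance.h)
target_include_directories(math PUBLIC ${CMAKE_CURRENT_SOURCE_DIR})

# ./mesh/CMakeLists.txt
add_library(mesh Mesh.cpp Mesh.h Edge.h Face.h Node.h)
target_include_directories(mesh PUBLIC ${CMAKE_CURRENT_SOURCE_DIR})
target_link_libraries(mesh math)

Each directory lists only its own files with no path juggling, and the executable picks up math transitively through mesh.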

IntelliJ IDEA not picking up correct application-{}.properties file

I have a Spring Boot 1.5.1 project that uses profile-specific properties files. In my /src/main/resources I have all my properties files.
When using IntelliJ 2016.3.4 I set the
Run Configuration | Active Profile
to "local" and run it. I see this in the console:
The following profiles are active: local
But there is a value in the property file
data.count.users=2
and used as:
@Value("${data.count.users}")
private int userCount;
that is not being picked up and thus causing the error:
Caused by: java.lang.IllegalArgumentException: Could not resolve
placeholder 'data.count.users' in string value "${data.count.users}"
However, if I run this via gradle
bootRun {
    systemProperty 'spring.profiles.active', System.properties['spring.profiles.active']
}
as
gradle bootRun -Dspring.profiles.active=local
then everything starts up using the local profile as expected. Can anyone see why this is not being properly picked up? In IntelliJ Project Structure I have my /src/main/resources defined as my Resource Folders.
UPDATE:
Adding screenshot of Configuration:
I could be wrong here but it doesn't look like the spring.profiles.active environment variable is actually set in your configuration, regardless of what you've selected as your Active Profile. This may be a bug with IntelliJ.
However, setting the environment variable in Run -> Edit Configurations definitely works for me.
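For example, either of the following, entered in Run -> Edit Configurations, should activate the profile (a sketch; the profile name is taken from the question):

# VM options:
-Dspring.profiles.active=local

# or, as an environment variable:
SPRING_PROFILES_ACTIVE=local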
Please add the Spring facet to your Spring Boot module to get full support.
Is the classpath of module heimdall the correct one, i.e. does it contain the shown resources folder with your application.properties?
If this doesn't help, please file a minimal sample project reproducing the exact structure of your project in our bug tracker; there are too many variables to investigate otherwise: https://youtrack.jetbrains.com/issues/IDEA.
Using -Dspring.config.location in VM options in IntelliJ helped me.
-Dspring.config.location=file:/C:/Users/<project path>/src/main/resources/application-dev.properties
This could also be due to a non-standard configuration setup, for instance:
src/main/resources
├── application.properties
├── config1
│   ├── application-dev.properties
│   ├── application-prod.properties
│   ├── application.properties
│   └── logback-spring.xml
├── config2
│   ├── application-dev.properties
│   ├── application-prod.properties
│   ├── application.properties
│   └── logback-spring.xml
└── config3
    ├── application-dev.properties
    ├── application-prod.properties
    ├── application.properties
    └── logback-spring.xml
This can be solved by passing the parameters logging.config and spring.config.name, for Logback and Spring respectively. For the above example:
java -jar \
    -Dspring.profiles.active=dev \
    -Dlogging.config=classpath:config1/logback-spring.xml \
    -Dspring.config.name=application,config1/application \
    target/my-application.0.0.1.jar
Here the root application.properties is used, overridden by config1/application.properties, which is in turn overridden by config1/application-dev.properties. The parameters (JVM system properties) can be specified in IDEA's run configuration under VM Options.
As far as advanced IDE support (highlighting, completion etc.) is concerned, there is an open issue for complex/custom configuration setups: IDEA-180498

msbuild: building an appxbundle (AppxBundle=Always not working)

I have a shared Windows 8.1 project with a Phone and a Desktop project in it. I defined different configurations to build x86/x64 for desktop and ARM for phone.
msbuild works fine without errors, but there is no final *.appxbundle file in the output folder (or anywhere else), although I set the parameter AppxBundle=Always.
my command looks like this:
msbuild myApp.sln /p:OutputPath=%OUTPATH%;Configuration=Phone;Platform=ARM;AppxBundle=Always;AppxBundlePlatforms=ARM
/t:Rebuild,Publish
The output is:
OUTPATH
├── ForBundle
│ └── AppxManifest.xml
├── AppxManifest.xml
├── App.WindowsPhone.build.appxrecipe
├── App.WindowsPhone_3.2.1_ARM.appx
├── App.WindowsPhone_3.2.1_scale-100.appx
├── App.WindowsPhone_3.2.1_scale-140.appx
├── App.WindowsPhone_3.2.1_scale-180.appx
├── resources.pri
└── SomeDependency.winmd
I tried to pack this folder with makeappx.exe bundle, but this didn't work, and I realized the folder looks a bit different from what goes into an appxbundle.
Creating an appxbundle via the VS GUI is no problem, but I would like to automate that step!
Thanks in advance!
There's a hint comment in Microsoft.AppXPackage.Targets:
When building on the command line or in TFS (determined by looking at the $(BuildingInsideVisualStudio) property), if build is invoked on an app package-producing project, the package for the project will be produced as part of building the project, without specifying any additional flags or targets. This is controlled by an MSBuild property named GenerateAppxPackageOnBuild, which is set to true by default.
If $(BuildingInsideVisualStudio) = false and $(GenerateAppxPackageOnBuild) = true, then build will also produce a package.
FYI, the file has moved for VS 2022; the new location is:
C:\Program Files\Microsoft Visual Studio\2022\Enterprise\MSBuild\Microsoft\VisualStudio\v17.0\AppxPackage
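Based on that hint, a command-line sketch (untested; GenerateAppxPackageOnBuild is the property named in the quoted comment, the other flags are taken from the question):

msbuild myApp.sln /t:Rebuild ^
    /p:Configuration=Phone;Platform=ARM;AppxBundle=Always;AppxBundlePlatforms=ARM;GenerateAppxPackageOnBuild=true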